
Frontiers of Information Technology & Electronic Engineering

ISSN 2095-9184 (print), ISSN 2095-9230 (online)

HADF: a hash-adaptive dual fusion implicit network for super-resolution of turbulent flows

Abstract: Turbulence, a complex multi-scale phenomenon inherent in fluid flow systems, presents critical challenges and opportunities for understanding physical mechanisms across scientific and engineering domains. Although high-resolution (HR) turbulence data remain indispensable for advancing both theoretical insight and engineering practice, their acquisition is severely limited by prohibitively high computational costs. While deep learning architectures show transformative potential in reconstructing high-fidelity flow representations from sparse measurements, current methodologies suffer from two inherent constraints: a strict reliance on perfectly paired training data and an inability to perform multi-scale reconstruction within a unified framework. To address these challenges, we propose HADF, a hash-adaptive dual fusion implicit network for turbulence reconstruction. Specifically, we develop a low-resolution (LR) consistency loss that enables effective model training when paired data are missing, eliminating the conventional requirement for fully matched LR and HR datasets. We further employ hash-adaptive spatial encoding and dynamic feature fusion to extract turbulence features, mapping them with implicit neural representations to reconstruct flows at arbitrary resolutions. Experimental results demonstrate that HADF outperforms state-of-the-art models in both global reconstruction accuracy and local physical properties. Trained only once, it precisely recovers fine turbulence details under partially unpaired data conditions and across diverse resolutions while remaining robust to noise.
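The low-resolution consistency loss described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the average-pooling degradation operator, the integer scale factor, and the function names are all assumptions introduced here for clarity. The idea is that the super-resolved field, when degraded back to the input resolution, should match the observed LR field, so no paired HR ground truth is needed for this term.

```python
import numpy as np

def avg_downsample(field, factor):
    """Average-pool a 2D field by an integer factor. This is a simple
    stand-in for the LR degradation operator; the paper's exact operator
    is not specified here."""
    h, w = field.shape
    return field[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def lr_consistency_loss(sr_field, lr_field, factor):
    """Mean-squared error between the downsampled super-resolved field
    and the observed low-resolution field. This supervises training
    when no paired HR sample is available for this snapshot."""
    return float(np.mean((avg_downsample(sr_field, factor) - lr_field) ** 2))
```

In a training loop, this term would be applied to the unpaired portion of the dataset, while paired samples (where available) can additionally use an ordinary HR reconstruction loss.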

Key words: Turbulence reconstruction; Deep learning; Unpaired data; Low-resolution consistency loss; Hash-adaptive spatial encoding; Dynamic feature fusion; Implicit neural representations
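The hash-based spatial encoding and arbitrary-resolution query can be illustrated with a toy multi-resolution hash grid, in the spirit of Instant-NGP-style encodings. Everything below is an assumption for illustration: the number of levels, table size, hash primes, and the nearest-cell lookup are simplifications, and the paper's "hash-adaptive" design is not reproduced here. The key property shown is that any continuous coordinate in [0, 1]^2 maps to a feature vector, so an implicit decoder can be queried on a grid of any resolution after a single training run.

```python
import numpy as np

# Large primes commonly used for spatial hashing (illustrative choice).
PRIMES = np.array([1, 2654435761], dtype=np.uint64)

def hash_encode(coords, levels=4, base_res=4, table_size=2**10, feat_dim=2):
    """Toy multi-resolution hash encoding.
    coords: (N, 2) array of query points in [0, 1]^2.
    Returns an (N, levels * feat_dim) feature matrix that a small MLP
    (the implicit decoder) would map to flow quantities."""
    rng = np.random.default_rng(0)  # fixed seed stands in for learned tables
    tables = rng.normal(scale=1e-2, size=(levels, table_size, feat_dim))
    feats = []
    for lvl in range(levels):
        res = base_res * 2 ** lvl                        # finer grid per level
        idx = np.floor(coords * res).astype(np.uint64)   # nearest grid cell
        h = (idx[:, 0] * PRIMES[0] ^ idx[:, 1] * PRIMES[1]) % np.uint64(table_size)
        feats.append(tables[lvl, h.astype(np.int64)])    # table lookup
    return np.concatenate(feats, axis=1)
```

Because the encoding is defined for continuous coordinates rather than a fixed pixel grid, reconstruction at a new resolution amounts to evaluating the decoder on a denser set of query points, which matches the abstract's claim of arbitrary-resolution reconstruction from one trained model.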

Chinese Summary  HADF: a hash-adaptive dual fusion implicit network for turbulence reconstruction

LIU Yunfei1,2,3, CHEN Xinhai1,2,3, ZHANG Gen1,2,3, ZHANG Qingyang1,2,3, WANG Qinglin1,2,3, LIU Jie1,2,3
1 National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China
2 Hunan Provincial Key Laboratory of Digital Software for High-End Equipment, National University of Defense Technology, Changsha 410073, China
3 College of Computer, National University of Defense Technology, Changsha 410073, China

Abstract: Turbulence is a complex multi-scale phenomenon in fluid systems; understanding its physical mechanisms poses major challenges, and important opportunities, across science and engineering. Although high-resolution turbulence data are essential for deepening theoretical research and advancing engineering applications, their acquisition is limited by high computational costs. In recent years, deep learning methods have shown remarkable potential for reconstructing high-fidelity flow fields from sparse measurements, yet existing approaches share two major limitations: an over-reliance on perfectly paired training data, and difficulty achieving multi-scale reconstruction within a unified framework. To address these problems, this paper proposes HADF, a hash-adaptive dual fusion implicit network for turbulence reconstruction. The method introduces a low-resolution consistency loss that enables stable training when paired data are partially missing, freeing the model from dependence on fully matched low- and high-resolution datasets. HADF further combines hash-adaptive spatial encoding with a dynamic feature fusion mechanism to extract turbulence features efficiently, and uses implicit neural representations to achieve continuous reconstruction at arbitrary resolutions. Experimental results show that HADF outperforms existing state-of-the-art models in both global reconstruction accuracy and local physical consistency. With only a single training run, it accurately reconstructs fine turbulence details for partially unpaired data and multi-resolution scenarios, and remains highly robust in the presence of noise.

Key words: Turbulence reconstruction; Deep learning; Unpaired data; Low-resolution consistency loss; Hash-adaptive spatial encoding; Dynamic feature fusion; Implicit neural representations



DOI: 10.1631/FITEE.2500419
CLC number: TP391.4; O35


On-line Access: 2026-01-08
Received: 2025-06-17
Revision Accepted: 2025-10-24
Crosschecked: 2026-01-08

Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952276; Fax: +86-571-87952331; E-mail: jzus@zju.edu.cn
Copyright © 2000~ Journal of Zhejiang University-SCIENCE