Differentiable Multi-Fidelity Fusion: Efficient Learning of Physics Simulations with Neural Architecture Search and Transfer Learning

CoRR (2023)

Abstract
With the rapid progress of deep learning, neural networks have been widely used as surrogate models in scientific research and engineering applications. Despite their great success in fitting complex systems, two major challenges remain: i) a lack of generalization across different problems/datasets, and ii) the demand for large amounts of simulation data, which are computationally expensive to obtain. To address these challenges, we propose the differentiable multi-fidelity (DMF) model, which leverages neural architecture search (NAS) to automatically find a suitable model architecture for each problem, and transfer learning to transfer knowledge learned from low-fidelity (fast but inaccurate) data to the high-fidelity (slow but accurate) model. Recent machine learning techniques such as hyperparameter search and alternate learning are used to improve the efficiency and robustness of DMF. As a result, DMF can efficiently learn physics simulations from only a few high-fidelity training samples, and it outperforms state-of-the-art methods by a significant margin (up to 58% improvement in RMSE) on a variety of synthetic and practical benchmark problems.
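The abstract does not specify how DMF is implemented; as a rough illustration of the low-to-high-fidelity transfer-learning idea it describes, the following PyTorch sketch pretrains a surrogate network on plentiful cheap low-fidelity samples and then fine-tunes it on a handful of expensive high-fidelity samples. The network, the synthetic 1-D "simulation" data, and the freeze-then-fine-tune schedule are all hypothetical stand-ins, not the authors' method.

```python
# Minimal sketch of low-to-high-fidelity transfer learning for a surrogate model.
# Hypothetical illustration only; not the paper's DMF implementation.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D problem: high fidelity = sin(8x); low fidelity = a biased, cheap approximation.
x_lo = torch.linspace(0, 1, 200).unsqueeze(1)   # many cheap low-fidelity samples
y_lo = torch.sin(8 * x_lo) + 0.3 * x_lo         # low-fidelity labels (systematically biased)
x_hi = torch.linspace(0, 1, 12).unsqueeze(1)    # only a few expensive high-fidelity samples
y_hi = torch.sin(8 * x_hi)                      # high-fidelity ground truth

model = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
loss_fn = nn.MSELoss()

# Stage 1: pretrain the surrogate on abundant low-fidelity data.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss_fn(model(x_lo), y_lo).backward()
    opt.step()

# Stage 2: transfer -- freeze the first layer, fine-tune the rest
# on the few high-fidelity samples.
for p in model[0].parameters():
    p.requires_grad = False
opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss_fn(model(x_hi), y_hi).backward()
    opt.step()

print("high-fidelity RMSE:", loss_fn(model(x_hi), y_hi).sqrt().item())
```

The design point this illustrates is the one the abstract claims: most of the fitting burden is carried by the cheap low-fidelity data, so only a few high-fidelity samples are needed to correct the surrogate.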
Keywords
neural architecture search,efficient learning,simulations,physics