Matryoshka Neural Operators: Learning Fast PDE Solvers for Multiscale Physics

crossref(2022)

Abstract
Running a high-resolution global climate model can take multiple days on the world's largest supercomputers. Because of the long runtimes caused by solving the underlying partial differential equations (PDEs), climate researchers struggle to generate the ensemble runs necessary for uncertainty quantification or for exploring climate policy decisions.

Physics-informed neural networks (PINNs) promise a solution: they can solve single instances of PDEs up to three orders of magnitude faster than traditional finite-difference numerical solvers. However, most approaches in physics-informed machine learning learn the solution of PDEs over the full spatio-temporal domain, which requires infeasible amounts of training data, fails to exploit knowledge of the underlying large-scale physics, and reduces model trust. Our philosophy is to limit learning to the hard-to-model parts. We therefore propose a novel method, the matryoshka neural operator, which leverages an established scheme from geophysical fluid dynamics called super-parametrization. Using this scheme, our physics-informed architecture exploits knowledge of the approximate large-scale dynamics and learns only the influence of the small-scale dynamics on the large-scale dynamics, also known as subgrid parametrizations.

Some work in geophysical fluid dynamics is conceptually similar but relies entirely on neural networks, which can only operate on fixed grids (Gentine et al., 2018). We are the first to learn grid-independent subgrid parametrizations by leveraging neural operators, which learn the dynamics in a grid-independent latent space. Neural operators can be seen as an extension of neural networks to infinite dimensions: they encode infinite-dimensional inputs into finite-dimensional representations, such as eigen- or Fourier modes, and learn the nonlinear temporal dynamics in the encoded state.

We demonstrate the neural operators by learning non-local subgrid parametrizations over the full large-scale domain of the two-scale Lorenz96 equation. We show that the proposed learning-based PDE solver is grid-independent, has quasilinear instead of quadratic complexity compared with a fully resolving numerical solver, is more accurate than current neural-network- or polynomial-based parametrizations, and offers interpretability through Fourier modes.

Gentine, P., Pritchard, M., Rasp, S., Reinaudi, G., and Yacalis, G. (2018). Could machine learning break the convection parameterization deadlock? Geophysical Research Letters, 45, 5742–5751. https://doi.org/10.1029/2018GL078202
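The two-scale Lorenz96 system used as the test bed above has a standard cyclic formulation in which the subgrid term appears explicitly as the small-scale coupling sum. A minimal NumPy sketch under the conventional parameter names (F forcing, h coupling strength, b spatial-scale ratio, c time-scale ratio; the function name is ours, not from the paper):

```python
import numpy as np

def lorenz96_two_scale_tendencies(X, Y, F=10.0, h=1.0, b=10.0, c=10.0):
    """Right-hand sides of the standard two-scale Lorenz96 system.

    X: (K,) large-scale variables; Y: (J, K) small-scale variables,
    with the J*K small-scale variables forming one cyclic ring.
    The coupling term -(h*c/b) * sum_j Y[j, k] is exactly the subgrid
    tendency that a parametrization must learn when only X is resolved.
    """
    # Large-scale: dX_k/dt = -X_{k-1}(X_{k-2} - X_{k+1}) - X_k + F - (hc/b) sum_j Y_{j,k}
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Y.sum(axis=0))
    # Small-scale: same advective structure on a faster time scale,
    # dY_{j,k}/dt = -c*b*Y_{j+1}(Y_{j+2} - Y_{j-1}) - c*Y_j + (hc/b) X_k.
    # Flatten so that j runs fastest around the ring (column-major order).
    Yf = Y.reshape(-1, order="F")
    dYf = (c * b * np.roll(Yf, -1) * (np.roll(Yf, 1) - np.roll(Yf, -2))
           - c * Yf + (h * c / b) * np.repeat(X, Y.shape[0]))
    return dX, dYf.reshape(Y.shape, order="F")
```

Replacing the explicit `Y.sum(axis=0)` coupling with a learned, grid-independent operator acting only on X is the subgrid-parametrization task the abstract describes.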