SineNet: Learning Temporal Dynamics in Time-Dependent Partial Differential Equations
arXiv (2024)
Abstract
We consider using deep neural networks to solve time-dependent partial
differential equations (PDEs), where multi-scale processing is crucial for
modeling complex, time-evolving dynamics. While the U-Net architecture with
skip connections is commonly used by prior studies to enable multi-scale
processing, our analysis shows that the need for features to evolve across
layers results in temporally misaligned features in skip connections, which
limits the model's performance. To address this limitation, we propose SineNet,
consisting of multiple sequentially connected U-shaped network blocks, referred
to as waves. In SineNet, high-resolution features are evolved progressively
through multiple stages, thereby reducing the amount of misalignment within
each stage. We furthermore analyze the role of skip connections in enabling
both parallel and sequential processing of multi-scale information. Our method
is rigorously tested on multiple PDE datasets, including the Navier-Stokes
equations and shallow water equations, showcasing the advantages of our
proposed approach over conventional U-Nets with a comparable parameter budget.
We further demonstrate that increasing the number of waves in SineNet while
maintaining the same number of parameters leads to a monotonically improved
performance. The results highlight the effectiveness of SineNet and the
potential of our approach in advancing the state-of-the-art in neural PDE
solver design. Our code is available as part of AIRS
(https://github.com/divelab/AIRS).
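The architecture described above, multiple sequentially connected U-shaped blocks ("waves"), can be illustrated with a minimal structural sketch. This is not the authors' implementation: the learned convolutions are replaced by identity maps, and the function names (`wave`, `sinenet`) and scale/wave counts are hypothetical, chosen only to show how skip connections within each wave and sequential composition across waves fit together.

```python
import numpy as np

def downsample(x):
    # 2x average pooling over the last two (spatial) dimensions
    h, w = x.shape[-2] // 2, x.shape[-1] // 2
    return x[..., :h * 2, :w * 2].reshape(*x.shape[:-2], h, 2, w, 2).mean(axis=(-3, -1))

def upsample(x):
    # 2x nearest-neighbor upsampling over the last two dimensions
    return x.repeat(2, axis=-2).repeat(2, axis=-1)

def wave(x, n_scales=3):
    """One U-shaped block: encoder saves skip features at each scale,
    decoder merges them back in. Learned layers are omitted (identity)."""
    skips = []
    for _ in range(n_scales):
        skips.append(x)      # feature kept for the skip connection
        x = downsample(x)    # move to the next, coarser scale
    for _ in range(n_scales):
        x = upsample(x) + skips.pop()  # decoder fuses the matching skip
    return x

def sinenet(x, n_waves=4):
    # Sequential composition of U-shaped blocks: each wave advances the
    # features a small step, reducing misalignment within any one block.
    for _ in range(n_waves):
        x = wave(x)
    return x

u = np.random.rand(1, 64, 64)  # one-channel field on a 64x64 grid
out = sinenet(u, n_waves=4)
print(out.shape)  # (1, 64, 64): spatial resolution is preserved
```

In this sketch, each `wave` produces only a small feature evolution, so the skip features it saves are closer in "time" to the decoder features they are fused with; stacking several waves covers the full temporal step in stages, which is the misalignment argument made in the abstract.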