Accelerating Data Generation for Neural Operators via Krylov Subspace Recycling
CoRR (2024)
Abstract
Learning neural operators for solving partial differential equations (PDEs)
has attracted great attention due to its high inference efficiency. However,
training such operators requires generating a substantial amount of labeled
data, i.e., PDE problems together with their solutions. The data generation
process is exceptionally time-consuming, as it involves solving numerous
systems of linear equations to obtain numerical solutions to the PDEs. Many
existing methods solve these systems independently without considering their
inherent similarities, resulting in extremely redundant computations. To tackle
this problem, we propose a novel method, namely Sorting Krylov Recycling (SKR),
to boost the efficiency of solving these systems, thus significantly
accelerating data generation for neural operator training. To the best of our
knowledge, SKR is the first attempt to address the time-consuming nature of
data generation for learning neural operators. The workhorse of SKR is
Krylov subspace recycling, a powerful technique for solving a series of
interrelated systems by leveraging their inherent similarities. Specifically,
SKR employs a sorting algorithm to arrange these systems in a sequence, where
adjacent systems exhibit high similarities. Then it equips a solver with Krylov
subspace recycling to solve the systems sequentially instead of independently,
thus effectively enhancing the solving efficiency. Both theoretical analysis
and extensive experiments demonstrate that SKR can significantly accelerate
neural operator data generation, achieving a remarkable speedup of up to 13.9
times.
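The two-step idea in the abstract — sort the systems so neighbors are similar, then solve them sequentially while reusing information between solves — can be sketched as follows. This is a hypothetical illustration, not the paper's SKR implementation: the family of systems, the sorting key (a single scalar coefficient), and the use of a warm start in place of full Krylov subspace recycling are all simplifying assumptions.

```python
import numpy as np
from scipy.sparse import identity, random as sparse_random
from scipy.sparse.linalg import gmres

# Hypothetical setup: a family of linear systems (A + c_i * I) x = b,
# as might arise from a parameterized PDE discretization.
rng = np.random.default_rng(0)
n = 200
A = sparse_random(n, n, density=0.05, random_state=rng) + 10.0 * identity(n)
b = rng.standard_normal(n)
coeffs = rng.uniform(0.0, 1.0, size=20)  # one parameter per system

# Step 1: order the systems so that adjacent ones are similar
# (here simply by the scalar coefficient; SKR uses its own sorting criterion).
order = np.argsort(coeffs)

# Step 2: solve sequentially, warm-starting each solve from the previous
# solution -- a cheap stand-in for full Krylov subspace recycling, which
# additionally carries a subspace of search directions across solves.
x = np.zeros(n)
for c in coeffs[order]:
    x, info = gmres(A + c * identity(n), b, x0=x)
    assert info == 0  # solver converged
```

When adjacent systems are close, the previous solution (and, in true recycling, the retained subspace) is a good starting point, so each solve needs far fewer iterations than solving the systems independently from scratch.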
Keywords
AI4PDE, Neural Operator, Data Generation, Krylov Subspace