Convergence Analysis of Flow Matching in Latent Space with Transformers
arXiv (2024)
Abstract
We present theoretical convergence guarantees for ODE-based generative
models, specifically flow matching. We use a pre-trained autoencoder network to
map high-dimensional original inputs to a low-dimensional latent space, where a
transformer network is trained to predict the velocity field of the
transformation from a standard normal distribution to the target latent
distribution. Our error analysis demonstrates the effectiveness of this
approach, showing that the distribution of samples generated via the estimated
ODE flow converges to the target distribution in the Wasserstein-2 distance
under
mild and practical assumptions. Furthermore, we show that arbitrary smooth
functions can be effectively approximated by transformer networks with
Lipschitz continuity, which may be of independent interest.
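The training objective described above can be sketched numerically. The following is a minimal sketch, assuming the standard conditional flow-matching setup with a linear interpolation path between noise and latent data; the transformer velocity network from the paper is left as a placeholder, and all names here are illustrative, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_matching_targets(z0, z1, t):
    """Given noise samples z0 ~ N(0, I), latent data samples z1, and times
    t in [0, 1], return the interpolated points z_t on the straight-line
    path and the conditional velocity target z1 - z0 (constant along a
    linear path). A velocity model v_theta(z_t, t) would be regressed
    onto this target with a squared-error loss."""
    t = t[:, None]                    # broadcast time over latent dimension
    zt = (1.0 - t) * z0 + t * z1      # point on the interpolation path
    vt = z1 - z0                      # velocity of the linear path
    return zt, vt

# Toy "latent" batch: noise from N(0, I), data from a shifted Gaussian.
d = 4
z0 = rng.standard_normal((8, d))
z1 = rng.standard_normal((8, d)) + 2.0
t = rng.uniform(size=8)

zt, vt = flow_matching_targets(z0, z1, t)

# Flow-matching loss for a (trivial) zero predictor, as a stand-in for
# || v_theta(z_t, t) - (z1 - z0) ||^2 averaged over the batch.
loss = np.mean((vt - 0.0) ** 2)
```

At sampling time, one would integrate the learned velocity field as an ODE from t = 0 (noise) to t = 1 and decode the result with the pre-trained autoencoder; the Wasserstein-2 bound in the paper controls the error of exactly this generated distribution.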