Multistep Consistency Models
arXiv (2024)
Abstract
Diffusion models are relatively easy to train but require many steps to
generate samples. Consistency models are far more difficult to train, but
generate samples in a single step.
In this paper we propose Multistep Consistency Models: A unification between
Consistency Models (Song et al., 2023) and TRACT (Berthelot et al., 2023) that
can interpolate between a consistency model and a diffusion model: a trade-off
between sampling speed and sampling quality. Specifically, a 1-step consistency
model is a conventional consistency model, whereas we show that an ∞-step
consistency model is a diffusion model.
Multistep Consistency Models work really well in practice. By increasing the
sample budget from a single step to 2-8 steps, we can train models more easily
that generate higher quality samples, while retaining much of the sampling
speed benefits. Notable results are 1.4 FID on ImageNet 64 in 8 steps and 2.1
FID on ImageNet 128 in 8 steps with consistency distillation. We also show that
our method scales to a text-to-image diffusion model, generating samples that
are very close to the quality of the original model.
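The interpolation described above can be illustrated with a small sampling sketch. This is not the paper's implementation: it assumes an EDM-style noise parameterization (x_t = x_0 + σ·ε) and a hypothetical `consistency_fn(x, sigma)` that returns a clean-sample estimate. With `num_steps=1` the loop reduces to ordinary one-step consistency sampling; increasing `num_steps` re-noises between segments, trading sampling speed for quality in the direction of a diffusion sampler.

```python
import numpy as np

def multistep_consistency_sample(consistency_fn, x_init, num_steps,
                                 sigma_max=80.0, seed=0):
    """Sketch of multistep consistency sampling (assumed interface).

    consistency_fn(x, sigma) -> estimate of the clean sample x0.
    The noise schedule is split into `num_steps` equal segments; within
    each segment we jump straight to a clean estimate, then re-noise to
    the next segment's noise level.
    """
    rng = np.random.default_rng(seed)
    # Noise levels from sigma_max down to 0, one boundary per segment.
    sigmas = np.linspace(sigma_max, 0.0, num_steps + 1)
    x = x_init
    for sigma, sigma_next in zip(sigmas[:-1], sigmas[1:]):
        x0 = consistency_fn(x, sigma)  # one-shot jump to a clean estimate
        if sigma_next > 0:
            # Re-noise to the start of the next segment (EDM-style).
            x = x0 + sigma_next * rng.standard_normal(np.shape(x0))
        else:
            x = x0  # final segment: return the clean estimate
    return x
```

For example, a perfect consistency model for a point mass at zero (`lambda x, s: np.zeros_like(x)`) yields an all-zero sample for any step budget, since the last segment ends at σ = 0.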