Linear Combination of Saved Checkpoints Makes Consistency and Diffusion Models Better
arXiv (2024)
Abstract
Diffusion Models (DM) and Consistency Models (CM) are two popular families of
generative models that achieve strong sample quality across a variety of tasks.
When training DM and CM, intermediate weight checkpoints are not fully
utilized: only the last converged checkpoint is used. In this work, we find
that high-quality model weights often lie in a basin that cannot be reached by
SGD but can be obtained by properly averaging checkpoints. Based on these
observations, we propose LCSC, a simple yet effective and efficient method to
enhance the performance of DM and CM by linearly combining checkpoints along
the training trajectory with coefficients found via evolutionary search. We demonstrate
the value of LCSC through two use cases: (a) Reducing training cost.
With LCSC, we only need to train DM/CM for fewer iterations and/or with
smaller batch sizes to obtain sample quality comparable to that of the fully trained
model. For example, LCSC achieves considerable training speedups for CM
(23× on CIFAR-10 and 15× on ImageNet-64). (b) Enhancing
pre-trained models. Assuming full training is already done, LCSC can further
improve the generation quality or speed of the final converged models. For
example, LCSC achieves better performance with a single function evaluation
(NFE) than the base model with 2 NFEs on consistency distillation, and reduces
the NFE of DM from 15 to 9 while maintaining generation quality on
CIFAR-10. Our code is available at
https://github.com/imagination-research/LCSC.
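
To make the idea concrete, below is a minimal Python sketch of the two
ingredients the abstract describes: a linear combination of saved checkpoints,
and an evolutionary search over the combination coefficients against a
sample-quality metric. This is an illustrative assumption rather than the
paper's implementation: the names `combine_checkpoints` and
`evolutionary_search`, the population scheme, the hyperparameters, and the
user-supplied `fitness` callback (e.g., FID of samples generated from the
combined weights, lower is better) are all hypothetical; see the linked
repository for the authors' code.

```python
import random
import torch

def combine_checkpoints(checkpoints, coeffs):
    """Linearly combine checkpoint state_dicts: w = sum_i c_i * w_i."""
    combined = {k: torch.zeros_like(v, dtype=torch.float32)
                for k, v in checkpoints[0].items()}
    for ckpt, c in zip(checkpoints, coeffs):
        for k, v in ckpt.items():
            combined[k] += c * v.float()
    return combined

def evolutionary_search(checkpoints, fitness, pop_size=16, generations=50,
                        mutate_std=0.05):
    """Toy evolutionary search for coefficients minimizing `fitness`."""
    n = len(checkpoints)

    def score(coeffs):
        return fitness(combine_checkpoints(checkpoints, coeffs.tolist()))

    # Initialize around uniform averaging; coefficients are left
    # unconstrained in this sketch, so the search can move away from
    # the plain checkpoint average.
    population = [torch.full((n,), 1.0 / n) + mutate_std * torch.randn(n)
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the best quarter, refill the population with Gaussian
        # mutations of the survivors.
        survivors = sorted(population, key=score)[: pop_size // 4]
        children = [random.choice(survivors) + mutate_std * torch.randn(n)
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    best = min(population, key=score)
    return best, combine_checkpoints(checkpoints, best.tolist())
```

A caller would load the returned weights into the generator, e.g. with
`model.load_state_dict(best_weights)`, and sample from the combined model.
Whether to constrain the coefficients (for instance, to sum to 1) is a design
choice this sketch leaves open.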