Score-based Conditional Generation with Fewer Labeled Data by Self-calibrating Classifier Guidance
CoRR (2023)
Abstract
Score-based generative models (SGMs) are a popular family of deep generative
models that achieve leading image generation quality. Early studies extend SGMs
to tackle class-conditional generation by coupling an unconditional SGM with
the guidance of a trained classifier. Nevertheless, such classifier-guided SGMs
do not always achieve accurate conditional generation, especially when trained
with fewer labeled data. We argue that the problem is rooted in the
classifier's tendency to overfit without coordinating with the underlying
unconditional distribution. To make the classifier respect the unconditional
distribution, we propose improving classifier-guided SGMs by letting the
classifier regularize itself. The key idea of our proposed method is to use
principles from energy-based models to convert the classifier into another view
of the unconditional SGM. Existing losses for unconditional SGMs can then be
leveraged to achieve regularization by calibrating the classifier's internal
unconditional scores. The regularization scheme can be applied to not only the
labeled data but also unlabeled data to further improve the classifier. Across
various percentages of labeled data, empirical results show that the
proposed approach significantly enhances conditional generation quality. The
enhancements confirm the potential of the proposed self-calibration technique
for generative modeling with limited labeled data.
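The core idea above, viewing a classifier as an energy-based model so that its internal unconditional score can be calibrated, can be illustrated with a small sketch. This is a minimal, hypothetical PyTorch illustration (the `ToyClassifier` architecture and function names are assumptions, not the paper's implementation): following energy-based-model principles, log p(x) is taken proportional to logsumexp over the classifier's logits, so an unconditional score emerges from the classifier's input gradient and can be regularized against an unconditional SGM's score target, while the usual guidance term ∇_x log p(y|x) is unchanged.

```python
import torch
import torch.nn as nn

class ToyClassifier(nn.Module):
    """Illustrative classifier; architecture is an assumption for the sketch."""
    def __init__(self, dim=8, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, 32), nn.SiLU(), nn.Linear(32, n_classes)
        )

    def forward(self, x):
        return self.net(x)  # logits f(x)[y]

def unconditional_score(clf, x):
    """Energy-based view: treat log p(x) as logsumexp_y f(x)[y] (up to a
    constant), so the classifier's internal unconditional score is the
    gradient of that quantity w.r.t. the input."""
    x = x.detach().requires_grad_(True)
    log_p = torch.logsumexp(clf(x), dim=-1).sum()
    return torch.autograd.grad(log_p, x)[0]

def conditional_guidance(clf, x, y):
    """Standard classifier-guidance term: grad of log p(y|x) w.r.t. x."""
    x = x.detach().requires_grad_(True)
    log_p_y = clf(x).log_softmax(dim=-1).gather(1, y[:, None]).sum()
    return torch.autograd.grad(log_p_y, x)[0]

clf = ToyClassifier()
x = torch.randn(5, 8)
y = torch.randint(0, 4, (5,))
s_uncond = unconditional_score(clf, x)   # could be regressed to an SGM score target
g_cond = conditional_guidance(clf, x, y)  # used to steer the unconditional SGM
```

In this view, self-calibration would penalize the gap between `s_uncond` and the score estimated by the unconditional SGM; since the logsumexp term needs no label, the penalty also applies to unlabeled data.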