An Information-Theoretic Framework for Out-of-Distribution Generalization
arXiv (2024)
Abstract
We study Out-of-Distribution (OOD) generalization in machine learning and
propose a general framework that provides information-theoretic generalization
bounds. Our framework interpolates freely between Integral Probability Metric
(IPM) and f-divergence, which naturally recovers some known results
(including Wasserstein- and KL-bounds), as well as yields new generalization
bounds. Moreover, we show that our framework admits an optimal transport
interpretation. When evaluated on two concrete examples, the proposed bounds
either strictly improve upon existing bounds or recover the best among
existing OOD generalization bounds.
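For orientation, the two special cases that the abstract alludes to have well-known generic forms, sketched below with illustrative notation (a source distribution $\mu$, a target distribution $\mu'$, a hypothesis $w$, and a loss $\ell$); these are the standard textbook statements, not necessarily the paper's exact bounds. If $\ell(\cdot, w)$ is $1$-Lipschitz, the IPM (Wasserstein) route gives

$$\bigl|\mathbb{E}_{Z \sim \mu'}[\ell(Z, w)] - \mathbb{E}_{Z \sim \mu}[\ell(Z, w)]\bigr| \le W_1(\mu, \mu'),$$

while if $\ell(Z, w)$ is $\sigma$-sub-Gaussian under $\mu$, the $f$-divergence (KL) route via the Donsker–Varadhan variational formula gives

$$\bigl|\mathbb{E}_{Z \sim \mu'}[\ell(Z, w)] - \mathbb{E}_{Z \sim \mu}[\ell(Z, w)]\bigr| \le \sqrt{2\sigma^2 \, \mathrm{KL}(\mu' \,\|\, \mu)}.$$

The framework's claimed contribution is a single family of bounds that interpolates between these two regimes.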