Leveraging PAC-Bayes Theory and Gibbs Distributions for Generalization Bounds with Complexity Measures
CoRR (2024)
Abstract
In statistical learning theory, a generalization bound usually involves a
complexity measure imposed by the considered theoretical framework. This limits
the scope of such bounds, as other forms of capacity measures or
regularizations are used in algorithms. In this paper, we leverage the
framework of disintegrated PAC-Bayes bounds to derive a general generalization
bound instantiable with arbitrary complexity measures. One trick to prove such
a result involves considering a commonly used family of distributions: the
Gibbs distributions. Our bound holds in probability jointly over the
hypothesis and the learning sample, which allows the complexity measure to be
adapted to the generalization gap, since it can be customized to fit both the
hypothesis class and the task.
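
The abstract does not spell out the Gibbs family it refers to. As a minimal sketch, assuming a prior distribution \(\pi\) over a hypothesis class \(\mathcal{H}\), a sample-dependent complexity measure \(\mu(h, \mathcal{S})\), and a temperature parameter \(\lambda > 0\) (these symbol names are illustrative; the paper's exact parameterization may differ), a Gibbs distribution reweights the prior by the exponentiated complexity:

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Gibbs distribution over hypotheses: the prior \pi reweighted by the
% exponentiated complexity measure \mu(h, S), normalized over the class.
% (Illustrative notation; the paper may parameterize this differently.)
\[
  \rho_{\mathcal{S}}(h)
    \;=\; \frac{\exp\bigl(-\lambda\,\mu(h,\mathcal{S})\bigr)\,\pi(h)}
               {\mathbb{E}_{h'\sim\pi}\!\left[\exp\bigl(-\lambda\,\mu(h',\mathcal{S})\bigr)\right]},
  \qquad h \in \mathcal{H}.
\]
\end{document}

Intuitively, sampling \(h \sim \rho_{\mathcal{S}}\) favors hypotheses that the chosen measure \(\mu\) deems simple on the observed sample, and the divergence between \(\rho_{\mathcal{S}}\) and \(\pi\) can then be expressed through \(\mu\) itself, which is what lets the bound accommodate arbitrary complexity measures.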