Functions with average smoothness: structure, algorithms, and learning

COLT 2021

Abstract
We initiate a program of average-smoothness analysis for efficiently learning real-valued functions on metric spaces. Rather than using the (global) Lipschitz constant as the regularizer, we define a local slope at each point and gauge the function complexity as the average of these values. Since the average is often much smaller than the maximum, this complexity measure can yield considerably sharper generalization bounds, assuming that these admit a refinement where the global Lipschitz constant is replaced by our average of local slopes. Our first major contribution is to obtain just such distribution-sensitive bounds. This required overcoming a number of technical challenges, perhaps the most significant of which was bounding the empirical covering numbers, which can be much worse-behaved than the ambient ones. This in turn is based on a novel Lipschitz-type extension, which is a pointwise minimizer of the local slope and may be of independent interest. Our combinatorial results are accompanied by efficient algorithms for denoising the random sample, as well as guarantees that the extension from the sample to the whole space will continue to be, with high probability, smooth on average. Along the way, we discover a surprisingly rich combinatorial and analytic structure in the function class we define.
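The abstract does not spell out the definitions; a natural formalization takes the local slope of f at x to be Lambda_f(x) = sup_{y != x} |f(x) - f(y)| / rho(x, y), so that the Lipschitz constant is the maximum of these values and the proposed complexity measure is their average under the sampling distribution. The minimal Python sketch below (the helper local_slopes and the test function are illustrative assumptions, not the paper's code) shows on a synthetic sample how the average of local slopes can sit well below the maximum.

```python
import numpy as np

def local_slopes(points, values):
    """Empirical local slope at each sample point.

    Hypothetical helper: computes sup_{y != x} |f(x) - f(y)| / d(x, y)
    over the sample, one plausible reading of the paper's local slope.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    # Avoid division by zero on the diagonal (y = x contributes slope 0).
    d = np.where(d > 0, d, np.inf)
    return (np.abs(values[:, None] - values[None, :]) / d).max(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(size=(200, 2))
# Steep only near the origin, flat elsewhere: the maximum slope is
# large, but the slope at a typical point is much smaller.
f = np.minimum(1.0, 5.0 * np.linalg.norm(x, axis=1))

slopes = local_slopes(x, f)
print(f"empirical Lipschitz constant (max slope): {slopes.max():.2f}")
print(f"average of local slopes:                  {slopes.mean():.2f}")
```

On such a sample, the maximum is driven by the few points near the origin, while the average reflects the typical, much flatter behavior of the function.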
Keywords
average smoothness, algorithms, functions