Some Fundamental Aspects about Lipschitz Continuity of Neural Networks
arXiv (2023)
Abstract
Lipschitz continuity is a crucial functional property of any predictive
model that naturally governs its robustness, generalisation, and
adversarial vulnerability. Contrary to other works that focus on obtaining
tighter bounds and developing different practical strategies to enforce certain
Lipschitz properties, we aim to thoroughly examine and characterise the
Lipschitz behaviour of Neural Networks. Thus, we carry out an empirical
investigation in a range of different settings (namely, architectures,
datasets, label noise, and more) by exhausting the limits of the simplest and
the most general lower and upper bounds. As a highlight of this investigation,
we showcase a remarkable fidelity of the lower Lipschitz bound, identify a
striking Double Descent trend in both the upper and lower bounds of the
Lipschitz constant, and explain the intriguing effects of label noise on
function smoothness and generalisation.
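The "simplest and most general" bounds mentioned above can be illustrated with the standard pair used for feed-forward networks: an upper bound given by the product of the layers' spectral norms, and a lower bound given by the largest Jacobian norm observed over sampled inputs. The sketch below, a minimal NumPy illustration of these two generic bounds on a toy ReLU network (not the paper's specific experimental setup), shows how they bracket the true Lipschitz constant:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer ReLU network: f(x) = W2 @ relu(W1 @ x)
W1 = rng.standard_normal((16, 4))
W2 = rng.standard_normal((1, 16))

def jacobian(x):
    # For a ReLU net, the Jacobian at x is W2 @ diag(1[W1 x > 0]) @ W1.
    mask = (W1 @ x > 0).astype(float)
    return W2 @ (mask[:, None] * W1)

# Upper bound: product of the layers' spectral norms (largest singular values).
upper = np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)

# Lower bound: maximum Jacobian spectral norm over randomly sampled inputs.
xs = rng.standard_normal((1000, 4))
lower = max(np.linalg.norm(jacobian(x), 2) for x in xs)

# The true Lipschitz constant of f lies in [lower, upper].
print(f"lower bound: {lower:.3f}, upper bound: {upper:.3f}")
```

The lower bound is "local" (it can only underestimate, since it probes finitely many points), while the spectral-norm product ignores how activation patterns interact across layers and so typically overestimates; the abstract's finding is that the lower bound tracks the true constant with remarkable fidelity.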