Deep double descent: where bigger models and more data hurt

Journal of Statistical Mechanics: Theory and Experiment (2020)

Abstract
We show that a variety of modern deep learning tasks exhibit a 'double-descent' phenomenon where, as we increase model size, performance first gets worse and then gets better. Moreover, we show that double descent occurs not just as a function of model size, but also as a function of the number of training epochs. We unify the above phenomena by defining a new complexity measure we call the effective model complexity and conjecture a generalized double descent with respect to this measure. Furthermore, our notion of model complexity allows us to identify certain regimes where increasing (even quadrupling) the number of train samples actually hurts test performance.
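The abstract describes model-wise double descent: as model size grows, test error first falls, rises again near the point where the model can just interpolate the training data, and then falls once more in the overparameterized regime. The snippet below is a minimal illustrative sketch, not the paper's deep-learning setup: it uses random ReLU features with a minimum-norm least-squares fit (a standard toy model for double descent), and all names, dimensions, and sample sizes are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumed toy setup, not the paper's experiments):
# sweep the width p of a random-feature model past the interpolation
# threshold (p ~ n_train) and watch test error peak, then fall again.

rng = np.random.default_rng(0)

d, n_train, n_test, noise = 20, 100, 1000, 0.1
w_true = rng.standard_normal(d) / np.sqrt(d)   # hypothetical ground-truth weights

def make_data(n):
    X = rng.standard_normal((n, d))
    y = X @ w_true + noise * rng.standard_normal(n)
    return X, y

X_tr, y_tr = make_data(n_train)
X_te, y_te = make_data(n_test)

# One wide random projection; a "model" of width p uses its first p columns.
W = rng.standard_normal((d, 2000)) / np.sqrt(d)

def test_mse(p):
    phi_tr = np.maximum(X_tr @ W[:, :p], 0.0)   # random ReLU features
    phi_te = np.maximum(X_te @ W[:, :p], 0.0)
    beta = np.linalg.pinv(phi_tr) @ y_tr        # minimum-norm least-squares fit
    return np.mean((phi_te @ beta - y_te) ** 2)

for p in (10, 50, 90, 100, 110, 150, 300, 600, 1200, 2000):
    print(f"width p={p:5d}  test MSE={test_mse(p):.3f}")
```

In this sketch the test error typically peaks around p = n_train (the interpolation threshold) and then decreases again as p grows, mirroring the model-size axis of the double-descent curve described above.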
Keywords
deep learning, machine learning, statistical inference