Learning sparse and smooth functions by deep Sigmoid nets

Applied Mathematics-A Journal of Chinese Universities, Series B (2023)

Abstract
To explain the advantage of deep nets in learning, we construct a deep net with three hidden layers and prove that the estimator obtained by empirical risk minimization (ERM) on this net achieves the optimal learning rates without the classical saturation problem. In other words, deepening the network to only three hidden layers overcomes saturation without degrading the optimal learning rates. These results underlie the success of deep nets and provide theoretical guidance for deep learning.
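To make the setting concrete, the following is a minimal sketch (not the authors' construction) of a network with three hidden sigmoid layers fitted by empirical risk minimization, here plain gradient descent on the empirical squared risk over a sample from a smooth target; the widths, learning rate, and target function are arbitrary illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Sample drawn from a smooth target function on [0, 1].
n = 200
X = rng.uniform(0, 1, size=(n, 1))
y = np.sin(2 * np.pi * X)

# Three hidden sigmoid layers of width 16 (illustrative choice),
# followed by a linear output layer.
widths = [1, 16, 16, 16, 1]
W = [rng.normal(0, 1.0, size=(widths[i], widths[i + 1])) for i in range(4)]
b = [np.zeros((1, widths[i + 1])) for i in range(4)]

def forward(X):
    a = X
    acts = [a]
    for i in range(3):                        # three sigmoid hidden layers
        a = sigmoid(a @ W[i] + b[i])
        acts.append(a)
    return acts[-1] @ W[3] + b[3], acts       # linear output

def erm_loss(pred, y):
    return np.mean((pred - y) ** 2)           # empirical risk (squared loss)

lr = 0.1
pred0, _ = forward(X)
loss0 = erm_loss(pred0, y)
for step in range(1000):
    pred, acts = forward(X)
    grad = 2 * (pred - y) / n                 # dL/d(output)
    for i in reversed(range(4)):              # backpropagation
        gW = acts[i].T @ grad
        gb = grad.sum(axis=0, keepdims=True)
        if i > 0:
            # Propagate through sigmoid: s'(z) = s(z) * (1 - s(z)).
            grad = (grad @ W[i].T) * acts[i] * (1 - acts[i])
        W[i] -= lr * gW
        b[i] -= lr * gb

pred1, _ = forward(X)
loss1 = erm_loss(pred1, y)
print(loss0, loss1)
```

The ERM estimator in the paper is the minimizer of this empirical risk over the network class; gradient descent here merely approximates that minimizer, and the paper's contribution is the statistical rate of the minimizer itself, not the optimization procedure.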
Keywords
generalization, deep learning, deep neural networks, learning rate, sparse