Bounds on the Approximation Power of Feedforward Neural Networks

ICML (2018)

Citations: 23 | Views: 20
Abstract
The approximation power of general feedforward neural networks with piecewise linear activation functions is investigated. First, lower bounds on the size of a network are established in terms of the approximation error and network depth and width. These bounds improve upon state-of-the-art bounds for certain classes of functions, such as strongly convex functions. Second, an upper bound is established on the difference of two neural networks with identical weights but different activation functions.
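As a rough illustration of the second result, the sketch below (not taken from the paper) builds two small feedforward networks that share the same weights but use different piecewise linear activations (ReLU vs. leaky ReLU) and estimates their output difference on sample inputs. The layer widths, input domain, and function names are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's construction): two networks with
# identical weights but different piecewise linear activations, compared numerically.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small network: two hidden layers of width 16 on inputs in [0, 1]^4.
dims = [4, 16, 16, 1]
weights = [rng.normal(scale=0.5, size=(m, n)) for m, n in zip(dims[:-1], dims[1:])]
biases = [rng.normal(scale=0.1, size=n) for n in dims[1:]]

def forward(x, activation):
    """Evaluate the shared-weight feedforward net with a given activation."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = activation(h @ W + b)
    return h @ weights[-1] + biases[-1]  # linear output layer

relu = lambda z: np.maximum(z, 0.0)
leaky_relu = lambda z: np.where(z > 0.0, z, 0.01 * z)

# Same weights, two activations: estimate the sup-norm gap over random inputs,
# which is the kind of quantity the abstract's second result bounds.
x = rng.uniform(0.0, 1.0, size=(10000, dims[0]))
gap = np.max(np.abs(forward(x, relu) - forward(x, leaky_relu)))
print(f"Estimated sup-norm gap between the two networks: {gap:.4f}")
```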
Keywords
feedforward neural networks, neural networks, approximation power