Training Neural Networks by Lifted Proximal Operator Machines

IEEE Transactions on Pattern Analysis and Machine Intelligence (2022)

Cited by 17 | Views 107
Abstract
We present the lifted proximal operator machine (LPOM) to train fully-connected feed-forward neural networks. LPOM represents the activation function as an equivalent proximal operator and adds the proximal operators to the objective function of a network as penalties. LPOM is block multi-convex in all layer-wise weights and activations. This allows us to develop a new block coordinate descent (BCD)...
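The abstract's key observation is that common activation functions can be written as proximal operators. As a minimal illustration (not the paper's implementation): ReLU coincides with the proximal operator of the indicator function of the nonnegative orthant, i.e. the Euclidean projection onto u >= 0, which has the closed form clip(x, 0, inf).

```python
import numpy as np

def relu(x):
    """Standard ReLU activation."""
    return np.maximum(x, 0.0)

def prox_nonneg(x):
    # Proximal operator of the indicator of the nonnegative orthant:
    #   argmin_u 0.5 * ||u - x||^2  subject to  u >= 0
    # The minimizer is the projection of x onto u >= 0.
    return np.clip(x, 0.0, None)

x = np.linspace(-3.0, 3.0, 13)
print(np.allclose(relu(x), prox_nonneg(x)))  # the two maps agree pointwise
```

This equivalence is what lets a method like LPOM replace the nonsmooth activation with a penalty term in the training objective; the sketch above only verifies the ReLU case numerically.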
Keywords
Training, Artificial neural networks, Linear programming, Convergence, Tuning, Standards, Patents