Rethinking the PID Optimizer for Stochastic Optimization of Deep Networks

2020 IEEE International Conference on Multimedia and Expo (ICME)

Cited by 8 | Views 32
Abstract
Stochastic gradient descent with momentum (SGD-Momentum) suffers from the overshoot problem caused by the integral action of the momentum term. Recently, an ID optimizer was proposed to solve the overshoot problem with the help of derivative information. However, the derivative term is vulnerable to interference from high-frequency noise, especially for stochastic gradient descent methods that use minibatch data in each update step. In this work, we propose a complete PID optimizer, which weakens the effect of the D term and adds a P term to alleviate the overshoot problem more stably. To further reduce the interference of high-frequency noise, two effective and efficient methods are proposed to stabilize the training process. Extensive experiments on three widely used benchmark datasets of different scales, i.e., MNIST, CIFAR-10 and Tiny-ImageNet, demonstrate the superiority of our proposed PID optimizer on various popular deep neural networks.
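The abstract does not spell out the update equations, so the following is only a minimal NumPy sketch of a PID-style update in the spirit it describes: the momentum buffer plays the I (integral) role, the raw gradient supplies the P (proportional) term, and a low-pass-filtered gradient difference supplies a weakened D (derivative) term. The function name pid_step and the coefficients lr, momentum, kp, kd are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def pid_step(theta, grad, prev_grad, v, d,
             lr=0.1, momentum=0.9, kp=0.1, kd=0.5):
    """One PID-style parameter update (illustrative sketch only).

    I term: the usual momentum buffer `v`, accumulating past gradients.
    P term: the current gradient, weighted by `kp`.
    D term: a smoothed gradient difference `d`, weighted by `kd`; the
            exponential smoothing damps the high-frequency minibatch
            noise the abstract warns about.
    """
    v = momentum * v + grad                                   # I (integral)
    d = momentum * d + (1.0 - momentum) * (grad - prev_grad)  # D (derivative)
    theta = theta - lr * (kp * grad + v + kd * d)             # P + I + D step
    return theta, v, d

# Toy usage: minimize f(x) = 0.5 * x^T x, whose gradient is x itself.
theta = np.array([5.0, -3.0])
v = np.zeros_like(theta)
d = np.zeros_like(theta)
prev_grad = np.zeros_like(theta)
for _ in range(100):
    grad = theta                      # gradient of the toy quadratic
    theta, v, d = pid_step(theta, grad, prev_grad, v, d)
    prev_grad = grad
```

Note the (1 - momentum) factor on the D term: smoothing the gradient difference rather than using it raw is one plausible way to "weaken the effect of the D term" against minibatch noise, as the abstract suggests.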
Keywords
SGD, PID, Optimizer, Deep Networks