ASLR: an Adaptive Scheduler for Learning Rate

2021 International Joint Conference on Neural Networks (IJCNN)

Abstract
Training a neural network is a complicated and time-consuming task that involves adjusting and testing different combinations of hyperparameters. One of the essential hyperparameters is the learning rate, which balances the magnitude of the changes made at each training step. We introduce an Adaptive Scheduler for Learning Rate (ASLR) that significantly lowers the tuning effort, since it has only a single hyperparameter. ASLR produces results competitive with the state of the art for both hand-optimized learning rate schedulers and line search methods while requiring significantly less tuning effort. Our algorithm's computational cost is negligible, and it can be used to train various network topologies, including quantized networks.
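The abstract does not give ASLR's actual update rule, so the sketch below is only a hypothetical illustration of the general idea it describes: a scheduler that adapts the learning rate from observed training loss and exposes a single tunable hyperparameter (here called `factor`, an assumed name). It is not the authors' algorithm.

```python
# Hypothetical single-hyperparameter adaptive LR scheduler (illustration only;
# NOT the ASLR update rule, which the abstract does not specify).

class AdaptiveLRSketch:
    """Grows the learning rate when the loss improves and shrinks it
    when the loss worsens, controlled by one hyperparameter `factor`."""

    def __init__(self, lr: float = 0.1, factor: float = 1.1):
        self.lr = lr
        self.factor = factor            # the single hyperparameter
        self.prev_loss = float("inf")   # no loss observed yet

    def step(self, loss: float) -> float:
        """Update and return the learning rate given the latest loss."""
        if loss < self.prev_loss:
            self.lr *= self.factor      # loss improved: take larger steps
        else:
            self.lr /= self.factor      # loss worsened: take smaller steps
        self.prev_loss = loss
        return self.lr


if __name__ == "__main__":
    sched = AdaptiveLRSketch(lr=0.1, factor=1.1)
    for loss in [1.0, 0.8, 0.9, 0.7, 0.6]:  # toy loss trace
        print(f"loss={loss:.2f} -> lr={sched.step(loss):.4f}")
```

With a single knob like this, tuning reduces to a one-dimensional search, which is the kind of tuning-effort reduction the abstract claims for ASLR.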
Keywords
ASLR, adaptive scheduler, neural network training, time-consuming task, single hyperparameter, hand-optimized learning rate schedulers, line search methods