Adapprox: Adaptive Approximation in Adam Optimization via Randomized Low-Rank Matrices
arXiv (2024)
Abstract
As deep learning models exponentially increase in size, optimizers such as
Adam encounter significant memory consumption challenges due to the storage of
first and second moment data. Current memory-efficient methods like Adafactor
and CAME often compromise accuracy with their matrix factorization techniques.
Addressing this, we introduce Adapprox, a novel approach that employs
randomized low-rank matrix approximation for a more effective and accurate
approximation of Adam's second moment. Adapprox features an adaptive rank
selection mechanism, finely balancing accuracy and memory efficiency, and
includes an optional cosine similarity guidance strategy to enhance stability
and expedite convergence. In GPT-2 training and downstream tasks, Adapprox
surpasses AdamW by achieving 34.5% to 49.9% and 33.8% to 49.9% memory savings
for the 117M and 345M models, respectively, with the first moment enabled, and
further increases these savings when the first moment is disabled. Moreover, it
enhances convergence speed and improves downstream task performance relative to
its counterparts.
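
The abstract names randomized low-rank matrix approximation as the device for compressing Adam's second moment, but gives no pseudocode here. The snippet below is a minimal NumPy sketch of that general idea using a Halko-style randomized range finder; the function name `randomized_low_rank`, the rank `k`, the oversampling and power-iteration parameters, and the nonnegativity clipping are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def randomized_low_rank(V, k, oversample=5, power_iters=2, rng=None):
    """Approximate an n x m matrix V by B @ C, with B: n x k and C: k x m."""
    rng = np.random.default_rng() if rng is None else rng
    n, m = V.shape
    # Gaussian test matrix; extra columns (oversampling) improve the basis.
    omega = rng.standard_normal((m, k + oversample))
    Y = V @ omega
    # Power iterations sharpen the subspace when singular values decay slowly.
    for _ in range(power_iters):
        Y = V @ (V.T @ Y)
    Q, _ = np.linalg.qr(Y)   # orthonormal basis estimating range(V)
    B = Q[:, :k]             # keep the leading k basis vectors
    C = B.T @ V              # coefficients of V in that basis
    return B, C              # stores k*(n + m) numbers instead of n*m

# Illustration on a stand-in for one layer's second-moment statistics.
rng = np.random.default_rng(0)
G = rng.standard_normal((768, 3072))  # gradient of a GPT-2-sized weight matrix
V = G ** 2                            # elementwise squared gradients
B, C = randomized_low_rank(V, k=8, rng=rng)
V_hat = np.maximum(B @ C, 0.0)        # reconstruction, clipped to stay nonnegative
print("relative error:", np.linalg.norm(V - V_hat) / np.linalg.norm(V))
```

Storing the factors B and C costs k(n + m) values per weight matrix rather than nm, which is where memory savings over AdamW's dense second moment would come from; an adaptive rank selection mechanism like the one the abstract describes would then choose k per matrix to balance approximation accuracy against this footprint.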