How to Parameterize Asymmetric Quantization Ranges for Quantization-Aware Training
arXiv (2024)
Abstract
This paper investigates three different parameterizations of asymmetric
uniform quantization for quantization-aware training: (1) scale and offset, (2)
minimum and maximum, and (3) beta and gamma. We perform a comprehensive
comparative analysis of these parameterizations' influence on
quantization-aware training, using both controlled experiments and real-world
large language models. In particular, we focus on how their behavior changes
with critical training hyperparameters, namely bit width and learning rate.
Based on our investigation, we propose best practices to stabilize and
accelerate quantization-aware training with learnable asymmetric quantization
ranges.
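
For context, below is a minimal sketch of asymmetric uniform quantization as used in quantization-aware training. It shows the fake-quantization step and how a (min, max) range parameterization converts to an equivalent (scale, offset) form; the beta/gamma parameterization is not defined in the abstract and is therefore omitted. The function names, the unsigned 8-bit grid, and the rounding conventions are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def minmax_to_scale_offset(x_min, x_max, n_bits=8):
    """Convert a (min, max) range to an equivalent (scale, offset) pair.

    Illustrative helper; the paper's exact conventions may differ.
    """
    q_min, q_max = 0, 2 ** n_bits - 1          # unsigned integer grid
    scale = (x_max - x_min) / (q_max - q_min)  # step size between grid points
    offset = np.round(-x_min / scale)          # integer zero point
    return scale, offset

def fake_quantize(x, scale, offset, n_bits=8):
    """Asymmetric uniform fake quantization: quantize, clamp, dequantize.

    In QAT this runs in the forward pass; the rounding step is typically
    handled with a straight-through estimator in the backward pass.
    """
    q_min, q_max = 0, 2 ** n_bits - 1
    q = np.clip(np.round(x / scale) + offset, q_min, q_max)  # map to integer grid
    return scale * (q - offset)                              # map back to real values

# Example: the same asymmetric range [-0.4, 1.2] expressed both ways.
x = np.linspace(-0.5, 1.3, 5)
scale, offset = minmax_to_scale_offset(-0.4, 1.2)
print(fake_quantize(x, scale, offset))
```

Although the two parameterizations above describe the same quantization grid, the learnable parameters enter the forward pass differently, so their gradients and training dynamics differ; comparing such differences across parameterizations is the subject of the paper.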