Towards Accurate Network Quantization with Equivalent Smooth Regularizer

European Conference on Computer Vision (ECCV 2022)

Abstract
Neural network quantization has become a prevailing way to reduce the inference time and storage cost of full-precision models on mobile devices. However, quantized models still suffer from accuracy degradation caused by inappropriate gradients in the optimization phase, especially for low-bit precision networks and low-level vision tasks. To alleviate this issue, this paper defines a family of equivalent smooth regularizers for neural network quantization, named SQR, whose value is equivalent to the actual quantization error. Based on this definition, we propose a novel QSin regularizer as an instance of SQR, and build an algorithm to train networks with integer weights and activations. Extensive experiments on classification and super-resolution (SR) tasks show that the proposed method achieves higher accuracy than other prominent quantization approaches. For the SR task in particular, our method effectively alleviates the plaid artifacts of quantized networks in terms of visual quality.
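The abstract does not spell out the QSin formula, but to illustrate the idea of a smooth regularizer that agrees with the quantization error on the grid, here is a minimal PyTorch sketch of a sine-based penalty in this spirit. The function name qsin_regularizer and the scale/alpha parameters are illustrative assumptions, not the authors' exact definition.

```python
import torch

def qsin_regularizer(w: torch.Tensor, scale: float = 1.0,
                     alpha: float = 0.1) -> torch.Tensor:
    """Sine-based smooth penalty (illustrative, not the paper's exact
    QSin form): zero exactly on the quantization grid w = k * scale
    and differentiable everywhere, unlike the raw rounding error."""
    # Near each grid point, sin^2(pi * w / scale) behaves like
    # (pi * (w / scale - round(w / scale)))^2, i.e. it tracks the
    # squared quantization error while keeping gradients smooth.
    return alpha * torch.sin(torch.pi * w / scale).pow(2).mean()

if __name__ == "__main__":
    # Toy usage: the penalty is added to a stand-in task loss so that
    # gradient descent pulls weights toward integer grid points.
    w = torch.randn(4, 4, requires_grad=True)
    task_loss = w.pow(2).mean()          # placeholder task objective
    loss = task_loss + qsin_regularizer(w, scale=0.5)
    loss.backward()                      # gradients exist everywhere
    print(loss.item(), w.grad.abs().mean().item())
```

In a real training loop the penalty would be summed over all quantized weight (and activation) tensors, each with its own scale step; the key property shown here is that the surrogate is smooth, so it avoids the inappropriate gradients that the abstract attributes to direct quantization.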
Keywords
Network quantization, Smooth regularizer, Equivalence, Gradient, Low-level vision task