DAQ: Channel-Wise Distribution-Aware Quantization for Deep Image Super-Resolution Networks

2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)(2022)

Citations: 9 | Views: 7
Abstract
Since the resurgence of deep neural networks (DNNs), image super-resolution (SR) has seen huge progress in improving the quality of low-resolution images, albeit at a great cost in computation and resources. Recently, there have been several efforts to make DNNs more efficient via quantization. However, because SR demands pixel-level accuracy, it is more difficult to perform quantization without significantly sacrificing SR performance. To this end, we introduce a new ultra-low-precision yet effective quantization approach specifically designed for SR. In particular, we observe that in recent SR networks, each channel has different distribution characteristics. We therefore propose a channel-wise distribution-aware quantization scheme. Experimental results demonstrate that our proposed quantization, dubbed Distribution-Aware Quantization (DAQ), greatly reduces computational and resource costs without a significant sacrifice in SR performance, compared to other quantization methods.
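To illustrate the general idea of channel-wise distribution-aware quantization described in the abstract, the following is a minimal sketch, not the paper's exact method: it assumes each channel is normalized by its own statistics (mean and standard deviation) before uniform ultra-low-bit quantization, then rescaled back. The function name, the clipping range, and the bit-width default are illustrative assumptions.

```python
import torch

def channel_wise_quantize(x, num_bits=2):
    """Illustrative sketch of channel-wise distribution-aware quantization.

    Each channel of the activation tensor is normalized by its own mean and
    standard deviation, uniformly quantized to 2**num_bits levels, and then
    de-normalized. This is an assumed approximation, not DAQ as published.
    """
    # x: (N, C, H, W) feature map
    n, c, h, w = x.shape
    x_flat = x.reshape(n, c, -1)

    # Per-sample, per-channel distribution statistics
    mean = x_flat.mean(dim=-1, keepdim=True)
    std = x_flat.std(dim=-1, keepdim=True) + 1e-8

    # Distribution-aware normalization per channel
    x_norm = (x_flat - mean) / std

    # Uniform quantization of normalized values (assumed clipping range [-1, 1])
    qmax = 2 ** num_bits - 1
    x_clipped = x_norm.clamp(-1.0, 1.0)
    x_q = torch.round((x_clipped + 1.0) / 2.0 * qmax) / qmax * 2.0 - 1.0

    # De-normalize so downstream layers see values on the original scale
    x_dq = x_q * std + mean
    return x_dq.reshape(n, c, h, w)

# Usage example with a hypothetical SR feature map
feat = torch.randn(1, 64, 32, 32)
feat_q = channel_wise_quantize(feat, num_bits=2)
```

The per-channel statistics are what make the scheme "distribution-aware": channels with very different value ranges are each mapped onto the full set of quantization levels instead of sharing one layer-wide scale.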
Keywords
Image Processing -> Image Restoration; Deep Learning -> Efficient Training and Inference Methods for Networks