Down-scale Simplified Non-local Attention Networks with Application to Image Denoising

Research Square (2023)

Abstract
Recently, non-local (NL) attention modules and transformer-based methods have been successfully applied to various image processing tasks. Their success is owed to the full exploitation of long-range similarities among deep features. However, computing these long-range similarities is very expensive, which greatly limits the wide application of NL modules in image processing. Various improved methods have been proposed to reduce the computational complexity. Nevertheless, these accelerations of the NL attention computation rely on restricting the range over which feature-wise similarity is explored, which may limit the ability to model long-range relations and degrade performance. In this paper, inspired by down-scaling techniques and the recurrence of image patches across scales, we propose an efficient down-scale simplified NL (DSNL) attention module. Using down-scaling, we divide the deep feature maps into several coarse-scale feature maps, which contain cleaner versions of the feature patches in the original maps. Our method substantially reduces the computation of NL attention while leaving its range unchanged. The proposed DSNL attention module can be flexibly integrated into various convolutional networks. Numerical experiments on image denoising demonstrate that the proposed module yields better performance than the original NL attention module with significantly less computation time, and that the corresponding networks produce favorable results compared with many state-of-the-art methods.
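The abstract does not give implementation details, but the mechanism it describes (splitting a feature map into several coarse-scale maps that each still span the whole image, then applying NL attention within each split) can be sketched. Below is a minimal, hypothetical PyTorch sketch; the pixel-unshuffle split, the embedding width `channels // 2`, and the residual connection are our assumptions, not specifics from the paper.

```python
# Hypothetical sketch of a down-scale simplified non-local (DSNL) attention
# block. Assumption: "down-scale" is realised as a pixel-unshuffle
# (space-to-depth) split, so each coarse map is a polyphase subsampling that
# still spans the full image; attention cost then drops by a factor of s**2.
# Requires channels divisible by 2 and h, w divisible by scale.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DSNLAttention(nn.Module):
    def __init__(self, channels: int, scale: int = 2):
        super().__init__()
        self.scale = scale
        self.theta = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.phi = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.g = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.out = nn.Conv2d(channels // 2, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        s = self.scale
        # Split x into s*s coarse maps of size (h/s, w/s) and move the
        # splits into the batch dimension so NL attention runs per split.
        coarse = F.pixel_unshuffle(x, s)                   # (b, c*s*s, h/s, w/s)
        coarse = coarse.view(b, c, s * s, h // s, w // s)
        coarse = coarse.permute(0, 2, 1, 3, 4).reshape(b * s * s, c, h // s, w // s)
        q = self.theta(coarse).flatten(2).transpose(1, 2)  # (B, n, c/2), n = hw/s^2
        k = self.phi(coarse).flatten(2)                    # (B, c/2, n)
        v = self.g(coarse).flatten(2).transpose(1, 2)      # (B, n, c/2)
        attn = torch.softmax(q @ k / (c // 2) ** 0.5, dim=-1)
        y = (attn @ v).transpose(1, 2)                     # (B, c/2, n)
        y = self.out(y.view(b * s * s, c // 2, h // s, w // s))
        # Re-assemble the coarse maps back to the original resolution.
        y = y.view(b, s * s, c, h // s, w // s).permute(0, 2, 1, 3, 4)
        y = F.pixel_shuffle(y.reshape(b, c * s * s, h // s, w // s), s)
        return x + y  # residual connection, common for NL modules

# Example usage on a denoiser's deep feature map:
# x = torch.randn(1, 64, 64, 64)
# print(DSNLAttention(64, scale=2)(x).shape)  # torch.Size([1, 64, 64, 64])
```

With scale s, the pairwise similarity matrix shrinks from (hw)^2 entries to s^2 * (hw/s^2)^2 = (hw)^2 / s^2, so the attention cost drops by a factor of s^2 while each coarse map still covers the full spatial extent, consistent with the paper's claim that the attention range is unchanged.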
Keywords
image denoising, attention, networks, down-scale, non-local