A Multi-Attention Feature Distillation Neural Network for Lightweight Single Image Super-Resolution

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS (2024)

Abstract
In recent years, deep convolutional neural networks (CNNs) have produced remarkable performance improvements for single image super-resolution (SISR). Nevertheless, many CNN-based SISR models carry a large number of network parameters and high computational complexity owing to their deep or wide architectures. How to utilize deep features more fully while balancing model complexity against reconstruction performance remains one of the main challenges in this field. To address this problem, building on the well-known information multi-distillation model, a multi-attention feature distillation network termed MAFDN is developed for lightweight and accurate SISR. Specifically, an effective multi-attention feature distillation block (MAFDB) is designed and used as the basic feature extraction unit in MAFDN. With the help of attention layers including pixel attention, spatial attention, and channel attention, MAFDB uses multiple information distillation branches to learn more discriminative and representative features. Furthermore, MAFDB introduces a depthwise over-parameterized convolution (DO-Conv)-based residual block (OPCRB) to enhance its representational ability without incurring any increase in parameters or computation at inference, since DO-Conv folds into a standard convolution after training. Results on commonly used benchmark datasets demonstrate that MAFDN outperforms representative lightweight SISR models when both reconstruction performance and model complexity are taken into account. For example, for x4 SR on Set5, MAFDN (597K/33.79G) obtains PSNR/SSIM gains of 0.21 dB/0.0037 over the attention-based SR model AFAN (692K/50.90G) and 0.10 dB/0.0015 over the feature distillation-based SR model DDistill-SR (675K/32.83G).
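The abstract does not give implementation details, but the two core ideas (attention-guided information distillation and a foldable over-parameterized convolution) can be illustrated with a minimal PyTorch sketch. All class names, channel counts, split ratios, and attention placements below are assumptions made for illustration, not the authors' published MAFDN architecture.

```python
# Minimal sketch of attention-guided feature distillation with a
# DO-Conv-style foldable convolution. Hyperparameters are illustrative.
import torch
import torch.nn as nn

class PixelAttention(nn.Module):
    """Per-pixel gating: a 1x1 conv produces a sigmoid mask applied elementwise."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        return x * torch.sigmoid(self.conv(x))

class SpatialAttention(nn.Module):
    """Spatial gating computed from pooled channel statistics (mean and max maps)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, 7, padding=3)

    def forward(self, x):
        stats = torch.cat([x.mean(1, keepdim=True), x.max(1, keepdim=True).values], 1)
        return x * torch.sigmoid(self.conv(stats))

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel gating."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class DOConv2d(nn.Module):
    """Depthwise over-parameterized conv: the trainable kernel is the composition
    of a depthwise factor D and a conventional factor W. The product folds into
    one ordinary KxK kernel, so inference cost equals a plain Conv2d. Simplified
    here (random init, no bias, folding repeated every forward pass)."""
    def __init__(self, in_ch, out_ch, k=3, d_mul=None):
        super().__init__()
        d_mul = d_mul or k * k
        self.k, self.in_ch, self.out_ch = k, in_ch, out_ch
        self.D = nn.Parameter(torch.randn(in_ch, d_mul, k * k) * 0.1)
        self.W = nn.Parameter(torch.randn(out_ch, in_ch, d_mul) * 0.1)

    def forward(self, x):
        # Fold the two factors into a single (out, in, k, k) kernel.
        kernel = torch.einsum('oid,idk->oik', self.W, self.D)
        kernel = kernel.reshape(self.out_ch, self.in_ch, self.k, self.k)
        return nn.functional.conv2d(x, kernel, padding=self.k // 2)

class MAFDBSketch(nn.Module):
    """Three-stage information distillation: each stage keeps a thin 'distilled'
    slice and refines the remainder under a different attention; the distilled
    slices are fused under channel attention, with a residual connection."""
    def __init__(self, channels=48, distill_ratio=0.5):
        super().__init__()
        self.d = int(channels * distill_ratio)   # distilled channels per stage
        r = channels - self.d                    # channels passed onward
        self.stage1 = nn.Sequential(DOConv2d(channels, channels), PixelAttention(channels))
        self.stage2 = nn.Sequential(DOConv2d(r, channels), SpatialAttention())
        self.stage3 = DOConv2d(r, self.d)
        self.fuse = nn.Sequential(nn.Conv2d(3 * self.d, channels, 1), ChannelAttention(channels))

    def forward(self, x):
        f1 = self.stage1(x)
        d1, c1 = f1[:, :self.d], f1[:, self.d:]  # keep / refine split
        f2 = self.stage2(c1)
        d2, c2 = f2[:, :self.d], f2[:, self.d:]
        d3 = self.stage3(c2)
        return x + self.fuse(torch.cat([d1, d2, d3], 1))

x = torch.randn(1, 48, 32, 32)
print(MAFDBSketch()(x).shape)  # torch.Size([1, 48, 32, 32])
```

In a real implementation the DO-Conv factors would be folded once after training rather than on every forward pass; the sketch folds inline only to keep the example short.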