Asymmetric Feature Fusion Network for Hyperspectral and SAR Image Classification

IEEE Transactions on Neural Networks and Learning Systems (2023)

Cited by 47 | Views 68
Abstract
Joint classification using multisource remote sensing data for Earth observation is promising but challenging. Due to the gap in imaging mechanisms and the information imbalance between multisource data, integrating their complementary merits for interpretation remains difficult. In this article, a classification method based on asymmetric feature fusion, named the asymmetric feature fusion network (AsyFFNet), is proposed. First, weight-shared residual blocks are used for feature extraction while keeping separate batch normalization (BN) layers for each source. In the training phase, the redundancy of each channel is determined automatically by its scaling factor in BN; a channel whose scaling factor falls below a threshold is replaced by the corresponding channel from the other source. To eliminate unnecessary channels and improve generalization, a sparse constraint is imposed on part of the scaling factors. In addition, a feature calibration module is designed to exploit the spatial dependence of multisource features, so that the discrimination capability is enhanced. Experimental results on three datasets demonstrate that the proposed AsyFFNet significantly outperforms other competitive approaches.
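The two mechanisms described in the abstract, weight-shared convolutions with per-source BN layers and a sparsity penalty on BN scaling factors that flags redundant channels, can be illustrated with a short sketch. This is a minimal illustration assuming PyTorch; the names (SharedResBlock, bn_sparsity_loss, redundant_channels) and the threshold value are assumptions for exposition, not the paper's released implementation.

```python
# Minimal sketch, assuming PyTorch. Names and hyperparameters are illustrative,
# not taken from the AsyFFNet code release.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedResBlock(nn.Module):
    """Residual block whose convolution weights are shared across sources,
    while each source keeps its own batch normalization layers."""

    def __init__(self, channels):
        super().__init__()
        # Convolutions are shared by both branches (weight sharing).
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        # Separate BN layers for the hyperspectral and SAR branches.
        self.bn1 = nn.ModuleDict({m: nn.BatchNorm2d(channels) for m in ("hsi", "sar")})
        self.bn2 = nn.ModuleDict({m: nn.BatchNorm2d(channels) for m in ("hsi", "sar")})

    def forward(self, x, modality):
        out = F.relu(self.bn1[modality](self.conv1(x)))
        out = self.bn2[modality](self.conv2(out))
        return F.relu(out + x)


def bn_sparsity_loss(blocks, modality="sar"):
    """L1 penalty on the BN scaling factors (gamma) of one branch,
    pushing redundant channels' scaling factors toward zero."""
    return sum(b.bn1[modality].weight.abs().sum() + b.bn2[modality].weight.abs().sum()
               for b in blocks)


def redundant_channels(bn, threshold=1e-2):
    """Channels whose scaling factor falls below a threshold are treated as
    redundant and are candidates for replacement by the other source's channels."""
    return (bn.weight.abs() < threshold).nonzero(as_tuple=True)[0]


# Usage sketch: the same blocks process both sources, and the sparsity term
# is added to the classification loss during training.
blocks = nn.ModuleList([SharedResBlock(64) for _ in range(2)])
x_hsi, x_sar = torch.randn(4, 64, 32, 32), torch.randn(4, 64, 32, 32)
for b in blocks:
    x_hsi, x_sar = b(x_hsi, "hsi"), b(x_sar, "sar")
sparsity_term = 1e-4 * bn_sparsity_loss(blocks)  # add to the task loss
```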
Keywords
Feature extraction, Radar polarimetry, Synthetic aperture radar, Calibration, Laser radar, Redundancy, Convolutional neural networks, Asymmetric feature fusion network (AsyFFNet), feature calibration, multisource remote sensing