
Multi-Scale Feature Attention and Transformer for Hyperspectral Image Classification.

Workshop on Hyperspectral Image and Signal Processing (2023)

Abstract
In hyperspectral image (HSI) classification, convolutional neural networks (CNNs) have shown great potential, but they often overlook multi-scale information and the relationships between features at different scales. To address these problems, an HSI classification method based on multi-scale feature attention and transformer (MSFAT) is proposed in this paper. Specifically, the proposed MSFAT first extracts multi-scale features using convolutional kernels of different sizes. Then, a squeeze-and-excitation (SE) module is used to obtain attention weights for the features at each scale. Next, a simple but effective cross-scale attention module enhances informative features across scales. Furthermore, to extract more discriminative features, a transformer encoder is incorporated to capture long-range dependencies between features at different scales. Experimental results on two common hyperspectral scenes show that the proposed MSFAT achieves favorable classification performance compared with several advanced methods.
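The abstract describes a pipeline of multi-scale convolutions, per-scale SE attention, and a transformer encoder over the resulting scale features. The PyTorch sketch below shows one way such a pipeline could be wired together; it is an assumption-laden illustration, not the authors' implementation. The dedicated cross-scale attention module is folded into the transformer stage for brevity, and all names and sizes (MSFATSketch, channels=64, 9x9 patches, 30 bands, 16 classes) are hypothetical.

```python
# Minimal sketch of the MSFAT-style pipeline described in the abstract
# (multi-scale convs -> SE attention per scale -> transformer over scales).
# NOT the authors' code; layer widths, patch size, and class count are assumed.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: channel-wise attention weights for one scale."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                          # x: (B, C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))            # squeeze -> (B, C)
        return x * w.unsqueeze(-1).unsqueeze(-1)   # excite

class MSFATSketch(nn.Module):
    """Multi-scale conv branches -> per-scale SE -> transformer over scale tokens."""
    def __init__(self, in_bands=30, channels=64, num_classes=16,
                 kernel_sizes=(1, 3, 5)):
        super().__init__()
        # One branch per kernel size extracts features at a different scale.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(in_bands, channels, k, padding=k // 2),
                nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
                SEBlock(channels),
            ) for k in kernel_sizes
        )
        # Transformer encoder models long-range dependencies across scales
        # (standing in here for the paper's separate cross-scale attention).
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=channels, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x):                          # x: (B, bands, H, W) patch
        # One token per scale: global-average-pooled branch output.
        tokens = torch.stack(
            [b(x).mean(dim=(2, 3)) for b in self.branches], dim=1)  # (B, S, C)
        tokens = self.encoder(tokens)
        return self.head(tokens.mean(dim=1))       # (B, num_classes)

# Example: classify a batch of 9x9 spatial patches with 30 spectral bands.
logits = MSFATSketch()(torch.randn(8, 30, 9, 9))
print(logits.shape)  # torch.Size([8, 16])
```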
Keywords
Hyperspectral image (HSI) classification, multi-scale feature extraction, attention, transformer