Parallel Fusion Neural Network Considering Local and Global Semantic Information for Citrus Tree Canopy Segmentation

IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING (2024)

Abstract
Existing convolutional neural network (CNN) based methods tend to ignore contextual information in citrus tree canopy segmentation. Popular transformer models, while helpful for extracting global semantic information, overlook the edge details between citrus tree canopies and the background. To address these issues, we propose a parallel fusion neural network that considers both local and global semantic information for citrus tree canopy segmentation from 3-D data derived by unmanned aerial vehicle (UAV) mapping. In the feature extraction stage, a parallel architecture combining EfficientNet-V2 and the CSwin Transformer extracts local and global information of citrus trees. In the feature fusion stage, we design a coordinate attention-based fusion module to retain the contextual information and local edge details of citrus tree canopies. Additionally, to enhance the separability between tree canopies and complex backgrounds, 3-D data combining RGB imagery and a canopy height model derived from UAV photogrammetry are generated for citrus tree canopy segmentation. Experimental results indicate that the proposed method performs considerably better than methods based only on CNN or transformer models and outperforms state-of-the-art methods, achieving the highest mIoU score of 93.46%.
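The abstract describes a two-branch parallel encoder (a CNN branch for local edge details and a transformer branch for global context) whose features are merged by a coordinate attention-based fusion module, operating on a 4-channel input that stacks RGB with a UAV-derived canopy height model (CHM). The sketch below is a minimal, hypothetical PyTorch rendering of that idea only; the branch stubs, channel sizes, and names (LocalBranch-style convolutions, a generic transformer encoder, CoordAttentionFusion, ParallelFusionSegNet) are placeholders and do not reproduce the authors' EfficientNet-V2/CSwin implementation.

```python
import torch
import torch.nn as nn


class CoordAttentionFusion(nn.Module):
    """Fuse local (CNN) and global (transformer) feature maps with a
    simplified coordinate attention (after Hou et al., 2021)."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
        )
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, local_feat, global_feat):
        x = local_feat + global_feat                            # element-wise merge of the two branches
        n, c, h, w = x.shape
        # Pool along each spatial axis separately to keep positional information.
        x_h = x.mean(dim=3, keepdim=True)                       # (n, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)   # (n, c, w, 1)
        y = self.conv1(torch.cat([x_h, x_w], dim=2))            # shared 1x1 conv over both axes
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                   # attention along height
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # attention along width
        return x * a_h * a_w                                    # re-weight the fused features


class ParallelFusionSegNet(nn.Module):
    """Hypothetical two-branch encoder + fusion + decoder for binary
    canopy segmentation on RGB + CHM (4-channel) input."""

    def __init__(self, in_ch=4, feat_ch=64, num_classes=2):
        super().__init__()
        # Placeholder CNN branch (stands in for EfficientNet-V2): local edge details.
        self.local_branch = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Placeholder transformer branch (stands in for the CSwin Transformer): global context.
        self.patch_embed = nn.Conv2d(in_ch, feat_ch, kernel_size=4, stride=4)
        encoder_layer = nn.TransformerEncoderLayer(d_model=feat_ch, nhead=4, batch_first=True)
        self.global_branch = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.fusion = CoordAttentionFusion(feat_ch)
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(feat_ch, num_classes, kernel_size=1),
        )

    def forward(self, x):
        local_feat = self.local_branch(x)                       # (n, c, h/4, w/4)
        tokens = self.patch_embed(x)                            # (n, c, h/4, w/4)
        n, c, h, w = tokens.shape
        tokens = self.global_branch(tokens.flatten(2).transpose(1, 2))  # (n, h*w, c)
        global_feat = tokens.transpose(1, 2).reshape(n, c, h, w)
        fused = self.fusion(local_feat, global_feat)
        return self.decoder(fused)                              # per-pixel canopy/background logits


if __name__ == "__main__":
    # 4-channel input: RGB stacked with the UAV-derived canopy height model.
    model = ParallelFusionSegNet()
    logits = model(torch.randn(1, 4, 256, 256))
    print(logits.shape)  # torch.Size([1, 2, 256, 256])
```

The key design point carried over from the abstract is that the two branches run in parallel on the same input and are merged only at the fusion stage, so the attention re-weighting can balance global context against local edge detail before decoding.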
Key words
Citrus tree canopy, complex background, contextual information, self-attention mechanism, semantic segmentation