PLFM: Pixel-Level Merging of Intermediate Feature Maps by Disentangling and Fusing Spatial and Temporal Data for Cloud Removal

IEEE Transactions on Geoscience and Remote Sensing (2022)

Abstract
Cloud removal is a relevant topic in remote sensing, fostering the usability of medium- and high-resolution optical (OPT) images for Earth monitoring and study. Recent applications of deep generative models and sequence-to-sequence models have proved capable of advancing the field significantly. Nevertheless, some gaps remain: the amount of cloud coverage, temporal changes in the landscape, and the density and thickness of clouds need further investigation. In this work, we fill some of these gaps by introducing an innovative deep model. The proposed model is multimodal, relying on both spatial and temporal sources of information to restore the whole optical scene of interest. We combine the outcomes of temporal-sequence blending and of direct translation from synthetic aperture radar (SAR) to optical images to obtain a pixel-wise restoration of the whole scene. The reconstructed images preserve scene details without requiring a considerable cloud-free portion of the image. The advantage of our approach is demonstrated under various atmospheric conditions on different datasets. Quantitative and qualitative results show that the proposed method obtains cloud-free images while coping with landscape changes.
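The pixel-wise fusion described above can be illustrated with a minimal sketch. In the actual PLFM model the merging operates on learned intermediate feature maps (ConvLSTM and cGAN branches trained end-to-end); here, purely for intuition, two candidate reconstructions are blended per pixel with a hypothetical confidence map. All names (`merge_pixelwise`, `weight`) are illustrative assumptions, not the paper's API.

```python
import numpy as np

def merge_pixelwise(opt_temporal, opt_sar, weight):
    """Blend two candidate optical reconstructions per pixel.

    opt_temporal : (H, W, C) estimate from temporal-sequence blending
    opt_sar      : (H, W, C) estimate translated from SAR data
    weight       : (H, W) confidence in the temporal estimate, in [0, 1]
                   (hypothetical; PLFM learns its fusion rather than
                   using a fixed hand-crafted weight)
    """
    w = weight[..., None]  # broadcast the per-pixel weight over channels
    return w * opt_temporal + (1.0 - w) * opt_sar

# Toy 2x2 scene with 3 bands: trust the temporal estimate where the
# pixel history is reliable (weight 1), fall back to the SAR-derived
# estimate under persistent clouds (weight 0).
temporal = np.full((2, 2, 3), 0.8)
sar = np.full((2, 2, 3), 0.2)
weight = np.array([[1.0, 0.0],
                   [0.5, 1.0]])
merged = merge_pixelwise(temporal, sar, weight)
```

A per-pixel convex combination like this keeps each output pixel within the range spanned by its two candidate reconstructions, which is one simple way to preserve scene details from whichever source is more reliable at that location.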
Key words
Clouds, Optical imaging, Optical sensors, Image reconstruction, Radar polarimetry, Adaptation models, Image restoration, Cloud removal (CR), conditional generative adversarial networks (cGANs), convolutional long short-term memory (ConvLSTM), deep hierarchical model, multitemporal remote sensing (RS) images, synthetic aperture radar (SAR)-optical (OPT) data fusion