Remote sensing image recovery via enhanced residual learning and dual-luminance scheme
Published in: Knowledge-Based Systems, 2021-06, Vol. 222, Article 107013
Format: Article
Language: English
Summary: Low-quality (LQ) images seriously degrade the performance of information processing systems in the field of remote sensing. To recover a high-quality (HQ) remote sensing image from its LQ version, remote sensing image recovery (RSIR) methods have been widely studied. In this paper, we propose a novel enhanced residual convolutional neural network (ERCNN) with a dual-luminance scheme (DLS) for RSIR. Our network focuses on learning residual features to better handle high-frequency recovery. Specifically, ERCNN is constructed from enhanced residual groups (ERGs), and each ERG is in turn built from enhanced residual blocks (ERBs). Two strategies are used in each ERB: an enhanced feature-flow module that improves the flow of feature information with a reasonable parameter count, and a feature attention module that strengthens discriminative learning across feature maps. Furthermore, we introduce two kinds of reversible transformation layers into our network for a larger receptive field and a lower memory burden. Moreover, we propose DLS to further boost the RSIR ability of ERCNN, yielding the boosted version BERCNN. Experimental results on two typical RSIR problems demonstrate the superiority of our method over other RSIR methods.
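The abstract does not give the internal details of an ERB, but the general pattern it describes (an identity skip connection plus a residual branch gated by channel-wise feature attention) can be sketched in plain NumPy. Everything below is an illustrative assumption, not the authors' implementation: the layer sizes, the `reduction` ratio, and the random stand-in weights are all hypothetical.

```python
import numpy as np

def channel_attention(feat, reduction=4):
    """Squeeze-and-excitation-style gate over channels (a common form
    of feature attention; the paper's exact module may differ).
    feat: array of shape (C, H, W)."""
    c = feat.shape[0]
    squeezed = feat.mean(axis=(1, 2))             # global average pool -> (C,)
    # Excitation: two small fully connected layers. Random weights here
    # stand in for learned parameters.
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ squeezed, 0.0)       # ReLU
    scale = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid gate in (0, 1)
    return feat * scale[:, None, None]            # rescale each channel

def enhanced_residual_block(feat):
    """Identity skip plus an attention-weighted branch: the block only
    has to learn the residual (high-frequency) component of its input."""
    branch = channel_attention(np.maximum(feat, 0.0))
    return feat + branch

x = np.ones((8, 4, 4))
y = enhanced_residual_block(x)
print(y.shape)  # (8, 4, 4)
```

The key property this sketch shows is the residual formulation: the block's output is its input plus a learned correction, which is what lets a deep stack of such blocks concentrate on recovering high-frequency detail rather than re-encoding the whole image.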
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2021.107013