
Pan-Sharpening Based on Transformer With Redundancy Reduction


Bibliographic Details
Published in: IEEE Geoscience and Remote Sensing Letters, 2022, Vol. 19, p. 1-5
Main Authors: Zhang, Kai, Li, Zhuolin, Zhang, Feng, Wan, Wenbo, Sun, Jiande
Format: Article
Language:English
Description
Summary: Pan-sharpening methods based on deep neural networks (DNNs) have produced state-of-the-art results. However, the common information in the panchromatic (PAN) image and the low-spatial-resolution multispectral (LRMS) image is not sufficiently explored. Because PAN and LRMS images are collected from the same scene, they share some common information in addition to their respective unique information, so direct concatenation of the extracted features introduces redundancy into the feature space. To reduce this redundancy and exploit the global information in the source images, we propose a novel pan-sharpening method that combines a convolutional neural network with a transformer. Specifically, the PAN and LRMS images are encoded into unique features and common features by subnetworks consisting of convolution blocks and transformer blocks. The common features are then averaged and combined with the unique features from the source images to reconstruct the fused image. To extract accurate common features, an equality constraint is imposed on them. Experimental results show that the proposed method outperforms state-of-the-art methods on both reduced-scale and full-scale datasets. The source code is available at https://github.com/RSMagneto/TRRNet.
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2022.3186985
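The fusion step described in the summary, averaging the common features from the two sources and combining them with each source's unique features, can be sketched as below. The function names, array shapes, and the mean-squared-difference form of the equality constraint are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fuse_features(unique_pan, unique_lrms, common_pan, common_lrms):
    """Average the two common-feature maps and concatenate them with the
    unique features from each source along the channel axis (assumed layout:
    channels-first, i.e. arrays of shape (C, H, W))."""
    common = 0.5 * (common_pan + common_lrms)  # averaged common features
    return np.concatenate([unique_pan, common, unique_lrms], axis=0)

def equality_loss(common_pan, common_lrms):
    """One plausible instantiation of the equality constraint on the common
    features: penalize their mean squared difference. The actual loss used
    by the paper may differ."""
    return float(np.mean((common_pan - common_lrms) ** 2))
```

In this sketch the constraint drives the two common-feature maps toward agreement during training, so averaging them at fusion time discards little information while avoiding the redundancy of concatenating both copies.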