
Multibranch Feature Extraction and Feature Multiplexing Network for Pansharpening

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-13
Main Authors: Lei, Dajiang; Huang, Yihang; Zhang, Liping; Li, Weisheng
Format: Article
Language: English
Description
Summary: With the continuous development of deep neural networks in the visual field, their application to panchromatic sharpening (pansharpening) has received increasing attention from researchers. However, existing pansharpening methods generally lack the ability to combine domain knowledge of pansharpening with neural-network feature extraction, which limits both feature extraction and the discovery of new features. This article proposes a simple, modular, multibranch feature extraction and feature multiplexing network architecture designed not only to support feature reuse but also to learn well-expressed new features for pansharpening. In addition, we fuse domain knowledge of pansharpening to extract the spatial structure information of panchromatic images through gradient calculators, and we design structural and spectral compensation to fully extract and preserve the spatial structural and spectral information of the images. We conducted experiments on the QuickBird and WorldView-3 satellite data sets, and the experimental results reveal that the proposed method has advantages over the best currently available methods, achieving excellent results not only on objective evaluation metrics, both full-reference and no-reference, but also in subjective visual evaluation.
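The abstract does not specify how the gradient calculators extract spatial structure from the panchromatic band. As an illustrative sketch only (assuming Sobel-style operators, a common choice for this purpose; the function name `sobel_gradients` and all details below are hypothetical and not from the paper), such a calculator can be written as:

```python
import numpy as np

def sobel_gradients(pan: np.ndarray):
    """Extract spatial-structure information from a 2-D panchromatic image.

    Returns horizontal gradient, vertical gradient, and gradient magnitude,
    computed with 3x3 Sobel kernels and edge-replicated padding so the
    output has the same shape as the input.
    """
    # Sobel kernel for horizontal gradients; its transpose gives vertical.
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])
    ky = kx.T

    padded = np.pad(pan.astype(float), 1, mode="edge")
    h, w = pan.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))

    # Naive sliding-window correlation (clear, not fast).
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)

    magnitude = np.hypot(gx, gy)  # per-pixel spatial-structure map
    return gx, gy, magnitude
```

In a pansharpening network, such gradient maps would typically be fed as extra input channels or used in a structure-preserving loss term; either usage is an assumption here, not a claim about this article's design.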
ISSN:0196-2892
1558-0644
DOI:10.1109/TGRS.2021.3074624