PSRT: Pyramid Shuffle-and-Reshuffle Transformer for Multispectral and Hyperspectral Image Fusion

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023, Vol. 61, pp. 1-15
Main Authors: Deng, Shang-Qi, Deng, Liang-Jian, Wu, Xiao, Ran, Ran, Hong, Danfeng, Vivone, Gemine
Format: Article
Language: English
Description
Summary: The Transformer has received considerable attention in computer vision. Because of global self-attention, its computational complexity is quadratic in the number of tokens, which limits practical applications. This complexity issue can be efficiently mitigated by computing self-attention within groups of smaller fixed-size windows. In this article, we propose a novel Pyramid Shuffle-and-Reshuffle Transformer (PSRT) for the task of multispectral and hyperspectral image fusion (MHIF). Considering the strong correlation among different patches in remote sensing images and the complementary information among patches with high similarity, we design Shuffle-and-Reshuffle (SaR) modules that enable information interaction among global patches in an efficient manner. In addition, pyramid structures based on window self-attention support detail extraction. Extensive experiments on four widely used benchmark datasets demonstrate the superiority of the proposed PSRT, with few parameters, compared with several state-of-the-art approaches. The related code is available at https://github.com/Deng-shangqi/PSRT .
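The core idea summarized above (window self-attention keeps cost linear in token count, while a shuffle permutation lets distant windows exchange information before attention and an inverse reshuffle restores token order) can be illustrated with a minimal sketch. The `shuffle`/`reshuffle` functions below are a hypothetical interleaving permutation written for illustration, not the authors' exact implementation from the linked repository:

```python
import numpy as np

def shuffle(tokens, window):
    """Interleave tokens so each new window of size `window` gathers one
    token from every original window (hypothetical sketch of the SaR
    shuffle step). tokens: array of shape (N, C), N divisible by window."""
    n, c = tokens.shape
    g = n // window  # number of windows
    return tokens.reshape(g, window, c).transpose(1, 0, 2).reshape(n, c)

def reshuffle(tokens, window):
    """Inverse permutation: restore the original token order after
    window self-attention has been applied to the shuffled layout."""
    n, c = tokens.shape
    g = n // window
    return tokens.reshape(window, g, c).transpose(1, 0, 2).reshape(n, c)

# 8 tokens, 2 channels, window size 4 -> 2 windows.
x = np.arange(16).reshape(8, 2).astype(float)
s = shuffle(x, 4)
# After shuffling, each size-4 window mixes tokens from both original
# windows, so plain window attention now sees globally distant patches.
print(s[:, 0])                                   # [0. 8. 2. 10. 4. 12. 6. 14.]
print(np.array_equal(reshuffle(s, 4), x))        # True
```

Running window self-attention between `shuffle` and `reshuffle` gives each window access to tokens originating from every other window, at the cost of window attention (linear in N for fixed window size) rather than global attention (quadratic in N).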
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3244750