Cross-domain Few-shot Hyperspectral Image Classification With Class-wise Attention

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, p. 1-1
Main Authors: Wang, Wenzhen; Liu, Fang; Liu, Jia; Xiao, Liang
Format: Article
Language: English
Description
Summary: Few-shot learning (FSL) is an effective approach to hyperspectral image classification when only a few labeled samples are available. It learns transferable knowledge from sufficient labeled auxiliary data in order to classify unseen classes with limited labeled training samples. However, the distribution difference between the auxiliary data and the unseen classes prevents the learned transferable knowledge from being applied well to the new task. Therefore, a class-wise attentive cross-domain few-shot learning (CA-CFSL) framework is proposed in this paper, in which a feature extractor is learned to extract features that are both discriminative and domain invariant. The class-wise attention metric module (CAMM) introduces class-wise attention into the FSL framework to learn more discriminative features, which improves the inter-class decision boundaries. Furthermore, an asymmetric domain adversarial module (ADAM) is designed to enhance the ability to extract domain-invariant representations by combining asymmetric adversarial training with embedded domain-specific information. Experimental results on four public hyperspectral image datasets demonstrate that the proposed method outperforms existing methods.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3239411
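
The summary above describes a class-wise attention metric module (CAMM) that re-weights features per class before the few-shot metric comparison. The record does not give the paper's actual architecture, so the following is only a minimal PyTorch-style sketch of the general idea, assuming a prototypical-network-style metric with a per-prototype channel-attention MLP; the names ClassWiseAttention, feat_dim, n_way, and k_shot are illustrative assumptions, not the authors' code.

    # Minimal sketch (assumed design, not the paper's released implementation):
    # a class-wise attention vector is computed from each class prototype and
    # used to re-weight the query-prototype distance, channel by channel.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ClassWiseAttention(nn.Module):
        """Produces one channel-attention vector per class prototype (assumption)."""
        def __init__(self, feat_dim: int, reduction: int = 4):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(feat_dim, feat_dim // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(feat_dim // reduction, feat_dim),
            )

        def forward(self, prototypes: torch.Tensor) -> torch.Tensor:
            # prototypes: (n_way, feat_dim) -> attention weights in (0, 1)
            return torch.sigmoid(self.mlp(prototypes))

    def class_wise_metric(support, support_labels, query, attention):
        """Return query logits as negative attended squared distances to prototypes."""
        n_way = int(support_labels.max().item()) + 1
        # Mean of support embeddings per class -> class prototypes (n_way, feat_dim)
        prototypes = torch.stack(
            [support[support_labels == c].mean(dim=0) for c in range(n_way)])
        attn = attention(prototypes)                           # (n_way, feat_dim)
        diff = query.unsqueeze(1) - prototypes.unsqueeze(0)    # (n_query, n_way, feat_dim)
        dist = ((attn.unsqueeze(0) * diff) ** 2).sum(dim=-1)   # attended distance
        return -dist                                           # higher = more similar

    if __name__ == "__main__":
        feat_dim, n_way, k_shot, n_query = 64, 5, 3, 10
        support = torch.randn(n_way * k_shot, feat_dim)
        labels = torch.arange(n_way).repeat_interleave(k_shot)
        query = torch.randn(n_query, feat_dim)
        logits = class_wise_metric(support, labels, query, ClassWiseAttention(feat_dim))
        loss = F.cross_entropy(logits, torch.randint(0, n_way, (n_query,)))
        print(logits.shape, loss.item())

In this sketch the attention vector sharpens which channels matter for each class, which is one plausible way to improve inter-class decision boundaries as the summary claims; the paper's actual CAMM, and its asymmetric domain adversarial module (ADAM), may differ in structure and training details.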