
Semi-Supervised Remote-Sensing Image Scene Classification Using Representation Consistency Siamese Network

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-14
Main Authors: Miao, Wang; Geng, Jie; Jiang, Wen
Format: Article
Language: English
Description
Summary: Deep learning has achieved excellent performance in remote-sensing image scene classification because large annotated datasets are available for training. In practical applications, however, remote-sensing imagery typically offers only a few annotated samples alongside a large number of unannotated ones, which leads to overfitting of deep models and degrades scene-classification performance. To address these problems, a semi-supervised representation consistency Siamese network (SS-RCSN) is proposed for remote-sensing image scene classification. First, considering the intraclass diversity and interclass similarity of remote-sensing images, an Involution generative adversarial network (GAN) is utilized to extract discriminative features from remote-sensing images via unsupervised learning. Then, a Siamese network with a representation consistency loss is proposed for semi-supervised classification, which aims to reduce the differences between labeled and unlabeled data. Experimental results on the UC Merced dataset, the RESISC-45 dataset, the aerial image dataset (AID), and the RS dataset demonstrate that the method yields superior classification performance compared with other semi-supervised learning (SSL) methods.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3140485
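
The abstract above names a representation consistency loss applied across the Siamese branches but does not give its form. As a rough illustration of the general idea only, and not the SS-RCSN formulation from the paper itself, the Python sketch below pairs a weight-sharing encoder with an MSE consistency term between two views of the same image; the encoder choice, the MSE distance, the classifier head, and the weighting factor lam are all assumptions made for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SiameseEncoder(nn.Module):
        # Two weight-sharing branches: the same encoder processes both views,
        # which is the defining property of a Siamese network.
        def __init__(self, encoder: nn.Module):
            super().__init__()
            self.encoder = encoder  # hypothetical backbone; the paper extracts
                                    # features with an Involution-GAN instead

        def forward(self, view_a: torch.Tensor, view_b: torch.Tensor):
            return self.encoder(view_a), self.encoder(view_b)

    def representation_consistency_loss(z_a: torch.Tensor,
                                        z_b: torch.Tensor) -> torch.Tensor:
        # Penalizes disagreement between the two branch representations.
        # Requires no labels, so it can be applied to unlabeled images.
        return F.mse_loss(z_a, z_b)

    def semi_supervised_loss(logits, labels, z_a, z_b, lam: float = 1.0):
        # Supervised cross-entropy on the labeled subset plus the consistency
        # term on all images; lam (assumed here) balances the two objectives.
        return F.cross_entropy(logits, labels) \
            + lam * representation_consistency_loss(z_a, z_b)

In a typical semi-supervised loop of this kind, view_a and view_b would be two random augmentations of the same scene image, and the cross-entropy term would be computed only on the labeled portion of each minibatch, so the consistency term is what lets the unlabeled images shape the learned representation.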