Semi-supervised anchor graph ensemble for large-scale hyperspectral image classification

Bibliographic Details
Published in: International Journal of Remote Sensing 2022-03, Vol.43 (5), p.1894-1918
Main Authors: He, Ziping, Xia, Kewen, Hu, Yuhen, Yin, Zhixian, Wang, Sijie, Zhang, Jiangnan
Format: Article
Language:English
Subjects:
Description
Summary: Existing graph-based, semi-supervised hyperspectral image (HSI) classification models often suffer from prolonged execution time due to high computational complexity. In this work, we first propose a fast anchor graph regularization (FAGR) model for large-scale HSI classification. FAGR employs a simple anchor-based graph construction procedure and a new adjacency matrix among anchors to dramatically reduce the computational complexity while attaining good classification performance. To further improve classification accuracy, we propose a novel semi-supervised anchor graph ensemble (SAGE) model. SAGE is an ensemble realization of multiple FAGR models, with each component FAGR operating on a randomly selected subset of features. A meta classifier is applied to aggregate the outputs of the component classifiers to yield an ensemble classification result. We performed extensive experiments on three real-world HSI datasets to compare the performance of FAGR and SAGE against several existing graph-based HSI classifiers. The results show that the proposed SAGE achieves 95.78% classification accuracy on the Indian Pines dataset using limited labeled samples, outperforming existing models in both execution time and classification accuracy.
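To make the anchor-graph idea in the summary concrete, the sketch below builds an anchor affinity matrix and the low-rank adjacency it induces. This is an illustrative reconstruction of generic anchor-graph construction, not the authors' exact FAGR/SAGE code; the anchor count, neighbor count `s`, and Gaussian-style weighting are assumptions for illustration.

```python
import numpy as np

def anchor_graph(X, n_anchors=50, s=3, rng=None):
    """Build an n x m anchor-affinity matrix Z and the induced n x n graph W.

    Each sample is linked only to its s nearest anchors, so W = Z D^-1 Z^T
    is low-rank; this is what keeps anchor-graph methods cheap on large HSIs.
    """
    rng = np.random.default_rng(rng)
    # Pick anchors as random samples (k-means centroids are common in practice).
    anchors = X[rng.choice(len(X), n_anchors, replace=False)]
    # Distance from every sample to every anchor.
    d = np.linalg.norm(X[:, None, :] - anchors[None, :, :], axis=2)
    Z = np.zeros((len(X), n_anchors))
    for i in range(len(X)):
        nn = np.argsort(d[i])[:s]           # s nearest anchors
        w = np.exp(-d[i, nn] ** 2)          # Gaussian-style kernel weights
        Z[i, nn] = w / w.sum()              # each row sums to 1
    # Induced sample-to-sample adjacency via anchors: W = Z diag(Z^T 1)^-1 Z^T.
    lam = np.maximum(Z.sum(axis=0), 1e-12)  # guard against unused anchors
    W = Z @ np.diag(1.0 / lam) @ Z.T
    return Z, W
```

An ensemble in the spirit of SAGE would run this construction on several random feature (band) subsets and aggregate the resulting per-graph predictions with a meta classifier; the key computational point is that W never needs to be stored densely, since only Z and the m-vector of anchor degrees are required.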
ISSN:0143-1161
1366-5901
DOI:10.1080/01431161.2022.2048916