
Hybrid representation based on dictionaries for hyperspectral image classification

Bibliographic Details
Published in: Journal of Applied Remote Sensing, 2022-07, Vol. 16(3), p. 036514
Main Authors: Song, Fu-Xin; Deng, Shi-Wen
Format: Article
Language:English
Summary: In real-world applications, hyperspectral image (HSI) classification faces two critical problems: the Hughes phenomenon and the large spatial and spectral variabilities among HSI pixels, especially when only a few labeled pixels per class are available. To address these problems, we propose a unified framework for a hybrid representation model based on two dictionaries, in which the input pixel (or its feature) is explicitly decomposed into a class-specific component and a variation component via a Bayesian approach. Building on this framework, two hybrid representation-based classification approaches, named sparse and dense hybrid representation-based classification and hybrid sparse representation-based classification, are derived by imposing different priors on the separate components over the associated dictionaries. Moreover, techniques for designing the associated dictionaries and estimating the hyperparameters are also presented, further improving the practicality of the proposed approaches. Experimental results on four real hyperspectral datasets show that the two proposed approaches outperform conventional classifiers and are robust to spatial and spectral variabilities.
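
As a rough illustration of the decomposition the abstract describes, the sketch below splits a pixel y into a class-specific part over a dictionary D_c (sparse, L1 prior) and a variation part over a dictionary D_v (dense, L2 prior), then classifies by class-wise residual. The solver (a generic ISTA iteration), the function names, and the hyperparameters lam1/lam2 are assumptions made for this sketch; the paper's own Bayesian formulation, dictionary design, and hyperparameter-estimation procedure may differ.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sdhr_classify(y, D_c, D_v, labels, lam1=0.1, lam2=0.1, n_iter=200):
    """Illustrative sparse-and-dense hybrid representation classifier.

    Solves  min_{a,b} 0.5*||y - D_c a - D_v b||^2 + lam1*||a||_1
                      + 0.5*lam2*||b||^2
    by ISTA (a generic choice for this sketch, not the paper's algorithm),
    then assigns the class whose sub-dictionary best explains y.
    labels is a 1-D integer array: labels[j] = class of column j of D_c.
    """
    n_c, n_v = D_c.shape[1], D_v.shape[1]
    D = np.hstack([D_c, D_v])
    # Conservative step size: 1 / (Lipschitz constant of the smooth part).
    step = 1.0 / (np.linalg.norm(D, 2) ** 2 + lam2)
    a = np.zeros(n_c)
    b = np.zeros(n_v)
    for _ in range(n_iter):
        r = D_c @ a + D_v @ b - y
        # Gradient step on the smooth terms (data fit + L2 prior on b).
        a_new = a - step * (D_c.T @ r)
        b = b - step * (D_v.T @ r + lam2 * b)
        # Proximal step enforces sparsity on the class-specific part only.
        a = soft_threshold(a_new, step * lam1)
    # Class-wise residuals: keep only class c's columns of the sparse part.
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        residuals[c] = np.linalg.norm(y - D_c[:, mask] @ a[mask] - D_v @ b)
    return min(residuals, key=residuals.get)

Swapping the L2 prior on b for an L1 prior in this skeleton would make both components sparse, in the spirit of the second (hybrid sparse) approach, though again the paper's exact formulation may differ.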
ISSN: 1931-3195
DOI: 10.1117/1.JRS.16.036514