Hyperspectral Image Classification via Basic Thresholding Classifier

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2016-07, Vol. 54 (7), p. 4039-4051
Main Authors: Toksoz, Mehmet Altan, Ulusoy, Ilkay
Format: Article
Language:English
Description
Summary: We propose a lightweight sparsity-based algorithm, namely, the basic thresholding classifier (BTC), for hyperspectral image (HSI) classification. BTC is a pixelwise classifier that uses only the spectral features of a given test pixel. It performs the classification using a predetermined dictionary consisting of labeled training pixels and produces the class label and residual vector of the test pixel. Since incorporating spatial as well as spectral information is an effective way to improve HSI classification accuracy, we extend our proposal to a three-step spatial-spectral framework. First, every pixel of a given HSI is classified using BTC. The resulting residual vectors form a cube which can be interpreted as a stack of residual maps. Second, each residual map is filtered using an averaging filter. Finally, the class label of each test pixel is determined based on the minimal residual. Numerical results on public data sets show that our proposal outperforms well-known support vector machine-based techniques and sparsity-based greedy approaches such as simultaneous orthogonal matching pursuit in terms of both classification accuracy and computational cost.
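
The abstract describes the three-step framework but not the implementation details of BTC itself. The following Python sketch illustrates the pipeline as summarized above, assuming BTC follows the standard thresholding recipe from sparse recovery: keep the M dictionary atoms most correlated with the test pixel, solve a least-squares problem on that support, and score each class by its reconstruction residual. The function names, the parameter M, and the filter window size win are hypothetical choices for illustration, not specifics taken from the paper.

    # Sketch only: assumes a correlation-thresholding variant of BTC;
    # the paper's exact rule and parameters may differ.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def btc_classify(x, D, labels, M=32):
        # Correlate the test pixel with every dictionary atom
        # (x and the columns of D assumed unit-normalized).
        corr = np.abs(D.T @ x)
        support = np.argsort(corr)[-M:]      # keep the M most correlated atoms
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        classes = np.unique(labels)
        residuals = np.empty(len(classes))
        for i, c in enumerate(classes):
            mask = labels[support] == c      # atoms of class c in the support
            residuals[i] = np.linalg.norm(x - D[:, support[mask]] @ coef[mask])
        return classes[np.argmin(residuals)], residuals

    def spatial_spectral_btc(cube, D, labels, M=32, win=5):
        # Step 1: run BTC on every pixel; the per-class residuals
        # form a stack of residual maps (one map per class).
        H, W, _ = cube.shape
        classes = np.unique(labels)
        res_maps = np.empty((H, W, len(classes)))
        for r in range(H):
            for c in range(W):
                _, res_maps[r, c] = btc_classify(cube[r, c], D, labels, M)
        # Step 2: smooth each residual map with an averaging (box) filter.
        for k in range(len(classes)):
            res_maps[:, :, k] = uniform_filter(res_maps[:, :, k], size=win)
        # Step 3: the minimal filtered residual decides each pixel's label.
        return classes[np.argmin(res_maps, axis=2)]

Note that the averaging is applied to the residual maps rather than to the predicted label image, which is why the final labels are reassigned in step 3 from the filtered residuals.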
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2016.2535458