Deep Fully Convolutional Network-Based Spatial Distribution Prediction for Hyperspectral Image Classification
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2017-10, Vol. 55 (10), p. 5585-5599
Main Authors: , , , , ,
Format: Article
Language: English
Summary: Most of the existing spatial-spectral-based hyperspectral image classification (HSIC) methods extract spatial-spectral information by combining the pixels in a small neighborhood or by aggregating statistical and morphological characteristics. However, those strategies can only generate shallow appearance features with limited representative ability for classes with high interclass similarity and spatial diversity, and therefore reduce the classification accuracy. To this end, we present a novel HSIC framework, named the deep multiscale spatial-spectral feature extraction algorithm, which focuses on learning effective discriminant features for HSIC. First, a well-pretrained deep fully convolutional network based on VGG-verydeep-16 is introduced to extract the potential deep multiscale spatial structural information in the proposed hyperspectral imaging framework. Then, the spectral feature and the deep multiscale spatial feature are fused by a weighted fusion method. Finally, the fused feature is fed into a generic classifier to obtain the pixelwise classification. Compared with the existing spectral-spatial-based classification techniques, the proposed method provides state-of-the-art performance and is much more effective, especially for images with highly nonlinear distribution and spatial diversity.
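The abstract's pipeline (per-pixel spectral features, deep multiscale spatial features, weighted fusion, then a generic classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the fusion weight `alpha`, the L2 normalization, and the nearest-centroid classifier are all assumptions standing in for details the record does not give, and random vectors stand in for the VGG-16 FCN features.

```python
import numpy as np

def weighted_fusion(spectral, spatial, alpha=0.5):
    """Weighted concatenation of per-pixel spectral and deep spatial features.

    `alpha` is a hypothetical balance weight (not from the paper); each
    feature block is L2-normalized first so neither dominates by scale.
    """
    spectral = spectral / (np.linalg.norm(spectral, axis=1, keepdims=True) + 1e-12)
    spatial = spatial / (np.linalg.norm(spatial, axis=1, keepdims=True) + 1e-12)
    return np.hstack([alpha * spectral, (1.0 - alpha) * spatial])

def nearest_centroid_predict(train_x, train_y, test_x):
    """Stand-in for the paper's 'generic classifier': nearest class centroid."""
    classes = np.unique(train_y)
    centroids = np.stack([train_x[train_y == c].mean(axis=0) for c in classes])
    dists = ((test_x[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[dists.argmin(axis=1)]

# Toy demo: 100 pixels, 2 classes, 103-band spectra (e.g. Pavia-like),
# 512-dim vectors standing in for deep multiscale spatial features.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 50)
spectral_feats = rng.normal(labels[:, None], 1.0, size=(100, 103))
spatial_feats = rng.normal(labels[:, None], 1.0, size=(100, 512))

fused = weighted_fusion(spectral_feats, spatial_feats, alpha=0.4)
pred = nearest_centroid_predict(fused[:80], labels[:80], fused[80:])
```

In practice the spatial features would come from intermediate activations of a pretrained VGG-16 fully convolutional network upsampled to the image grid, and the classifier could be any pixelwise classifier (e.g. an SVM).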
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2017.2710079