A Crop Classification Method Integrating GF-3 PolSAR and Sentinel-2A Optical Data in the Dongting Lake Basin

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2018-09, Vol. 18 (9), p. 3139
Main Authors: Gao, Han, Wang, Changcheng, Wang, Guanya, Zhu, Jianjun, Tang, Yuqi, Shen, Peng, Zhu, Ziwei
Format: Article
Language: English
Summary: With the increasing number of satellite sensors, more multi-source data have become available for large-scale, high-precision crop classification. Both polarimetric synthetic aperture radar (PolSAR) and multi-spectral optical data have been widely used for classification. However, it is difficult to combine the covariance matrix of PolSAR data with the spectral bands of optical data. Using Hoekman's method, this study solves that problem by transforming the covariance matrix into an intensity vector comprising multiple intensity values on different polarization bases. To reduce feature redundancy, the principal component analysis (PCA) algorithm is adopted to select useful polarimetric and optical features. In this study, PolSAR data acquired by the Gaofen-3 (GF-3) satellite on 19 July 2017 and optical data acquired by Sentinel-2A on 17 July 2017 over the Dongting Lake basin are selected for the validation experiment. The results show that the full feature integration method proposed in this study achieves an overall classification accuracy of 85.27%, higher than that of single-dataset methods or other feature integration modes.
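
The abstract describes a feature-level fusion pipeline: transform the PolSAR covariance matrix into an intensity vector on multiple polarization bases (Hoekman's method), stack those intensities with the Sentinel-2A spectral bands, reduce redundancy with PCA, and classify the fused features. The following is a minimal Python sketch of that pipeline, not the authors' implementation: the array shapes, the 99% explained-variance threshold, and the random-forest classifier are all illustrative assumptions, and random arrays stand in for the real co-registered features.

    # Illustrative sketch only: fuse per-pixel PolSAR intensities with
    # optical bands, reduce dimensionality with PCA, then classify.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_pixels = 10_000

    # Placeholders for real, co-registered per-pixel features:
    # 'polsar' stands in for the Hoekman-style intensity vector derived
    # from the GF-3 covariance matrix; 'optical' for Sentinel-2A bands.
    polsar = rng.random((n_pixels, 9))
    optical = rng.random((n_pixels, 10))
    labels = rng.integers(0, 4, n_pixels)  # hypothetical crop classes

    # Stack both sources, standardize (SAR intensities and reflectances
    # live on different scales), and keep the principal components that
    # explain 99% of the variance to reduce feature redundancy.
    features = StandardScaler().fit_transform(np.hstack([polsar, optical]))
    reduced = PCA(n_components=0.99).fit_transform(features)

    # Pixel-wise supervised classification on the reduced features.
    split = n_pixels // 2
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(reduced[:split], labels[:split])
    print(f"Held-out accuracy: {clf.score(reduced[split:], labels[split:]):.4f}")

With real rasters the same flow applies after flattening each co-registered image stack to (n_pixels, n_features); the placeholder data here cannot, of course, reproduce the 85.27% overall accuracy the paper reports for its full feature integration.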
ISSN: 1424-8220
DOI: 10.3390/s18093139