Dual-Branch Deep Convolution Neural Network for Polarimetric SAR Image Classification

Bibliographic Details
Published in: Applied Sciences 2017-04, Vol. 7 (5), p. 447
Main Authors: Gao, Fei; Huang, Teng; Wang, Jun; Sun, Jinping; Hussain, Amir; Yang, Erfu
Format: Article
Language: English
Summary: The deep convolutional neural network (CNN), which has prominent advantages in feature learning, can learn and extract features from data automatically. Existing polarimetric synthetic aperture radar (PolSAR) image classification methods based on the CNN consider only the polarization information of the image and do not incorporate its spatial information. In this paper, a novel method based on a dual-branch deep convolutional neural network (Dual-CNN) is proposed to classify PolSAR images. The proposed method is built on two deep CNNs: one extracts polarization features from the 6-channel real matrix (6Ch) derived from the complex coherency matrix, and the other extracts spatial features from a Pauli RGB (Red Green Blue) image. The extracted features are first combined in a fully connected layer that shares the polarization and spatial properties, and the Softmax classifier is then employed to classify them. Experiments are conducted on Airborne Synthetic Aperture Radar (AIRSAR) data of Flevoland, and the results show that the classification accuracy on 14 types of land cover reaches 98.56%. These results are promising in comparison with other state-of-the-art methods.
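The abstract describes the architecture only at a high level. Below is a minimal, illustrative PyTorch sketch of the dual-branch idea: one branch consumes a 6-channel polarimetric patch, the other a 3-channel Pauli RGB patch, and their feature vectors are concatenated in a shared fully connected layer before classification. The layer counts, kernel sizes, 16x16 patch size, and 128-dimensional feature width are assumptions for illustration, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class BranchCNN(nn.Module):
    """Small CNN mapping an image patch to a feature vector.
    Layer sizes are illustrative, not the paper's configuration."""
    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # -> (N, 64, 1, 1)
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):
        return self.fc(self.features(x).flatten(1))

class DualCNN(nn.Module):
    """Dual-branch network: a polarization branch on the 6Ch real matrix
    and a spatial branch on the Pauli RGB image. Features from both are
    concatenated in a shared fully connected layer; Softmax is applied
    implicitly via CrossEntropyLoss at training time."""
    def __init__(self, num_classes: int = 14, feat_dim: int = 128):
        super().__init__()
        self.pol_branch = BranchCNN(in_channels=6, feat_dim=feat_dim)  # 6Ch input
        self.spa_branch = BranchCNN(in_channels=3, feat_dim=feat_dim)  # Pauli RGB input
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x_pol, x_rgb):
        f = torch.cat([self.pol_branch(x_pol), self.spa_branch(x_rgb)], dim=1)
        return self.classifier(f)  # logits; softmax gives class probabilities

# Usage on dummy per-pixel neighbourhood patches (batch of 8, 16x16 patches):
model = DualCNN(num_classes=14)
logits = model(torch.randn(8, 6, 16, 16), torch.randn(8, 3, 16, 16))
print(logits.shape)  # torch.Size([8, 14])
```

The key design choice this sketch captures is late fusion: each modality is encoded separately, so the polarization branch and the spatial branch can specialize before their features are jointly classified.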
ISSN: 2076-3417
DOI: 10.3390/app7050447