Group Bilinear CNNs for Dual-Polarized SAR Ship Classification
Published in: IEEE Geoscience and Remote Sensing Letters, 2022, Vol. 19, pp. 1-5
Main Authors:
Format: Article
Language: English
Summary: Ship classification from synthetic aperture radar (SAR) images has become a hotspot in the remote sensing community. To date, most efforts have been devoted to single-polarization (single-pol) SAR ship classification, with limited performance. This letter proposes to exploit dual-polarization (dual-pol) SAR images for better ship classification. Specifically, a novel group bilinear convolutional neural network (GBCNN) model is developed to extract discriminative second-order representations of ship targets from pairwise vertical-horizontal (VH) and vertical-vertical (VV) polarization SAR images. In particular, deep bilinear features are efficiently acquired by performing bilinear pooling on subgroups of deep feature maps derived from the single-pol SAR images (self-bilinear pooling) and the dual-pol SAR image pair (cross-bilinear pooling). To fully exploit the polarization information, a multipolarization fusion loss (MPFL) is constructed to train the proposed model for superior SAR ship representation learning. Extensive experiments show that the proposed method achieves overall accuracies (OAs) of 88.80% and 66.90% on the three- and five-category dual-pol OpenSARShip datasets, outperforming state-of-the-art methods by at least 2.00% and 2.37%, respectively.
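The self- and cross-bilinear pooling described in the summary can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, the group count, and the signed square-root plus L2 normalization step (a common convention for bilinear features) are assumptions, and the full GBCNN network and MPFL loss are not reproduced here.

```python
import numpy as np

def bilinear_pool(fa, fb):
    """Bilinear pooling of two feature maps of shape (C, H, W).

    Averages the outer product of channel vectors over spatial positions,
    yielding a C x C second-order descriptor, then applies the common
    signed square-root and L2 normalization (an assumed convention).
    """
    c, h, w = fa.shape
    a = fa.reshape(c, h * w)
    b = fb.reshape(c, h * w)
    z = a @ b.T / (h * w)                    # (C, C) second-order statistics
    z = np.sign(z) * np.sqrt(np.abs(z))      # signed square-root
    return z / (np.linalg.norm(z) + 1e-12)   # L2 normalization

def group_bilinear_features(f_vh, f_vv, groups=4):
    """Group bilinear pooling over VH/VV feature maps of shape (C, H, W).

    Channels are split into `groups` subgroups; within each subgroup we
    compute self-bilinear descriptors (VH x VH, VV x VV) and a
    cross-bilinear descriptor (VH x VV), then concatenate the flattened
    results. The group count is a hypothetical choice for illustration.
    """
    feats = []
    for ga, gb in zip(np.array_split(f_vh, groups),
                      np.array_split(f_vv, groups)):
        feats.append(bilinear_pool(ga, ga).ravel())  # self-bilinear (VH)
        feats.append(bilinear_pool(gb, gb).ravel())  # self-bilinear (VV)
        feats.append(bilinear_pool(ga, gb).ravel())  # cross-bilinear (VH-VV)
    return np.concatenate(feats)
```

With 16-channel feature maps split into 4 subgroups, each subgroup yields three 4 x 4 descriptors, giving a 192-dimensional feature vector that a classifier head could consume.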
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2022.3178080