HSF-Net: Multiscale Deep Feature Embedding for Ship Detection in Optical Remote Sensing Imagery

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2018-12, Vol. 56 (12), p. 7147-7161
Main Authors: Li, Qingpeng, Mou, Lichao, Liu, Qingjie, Wang, Yunhong, Zhu, Xiao Xiang
Format: Article
Language:English
Description
Summary: Ship detection is an important and challenging task in remote sensing applications. Most methods utilize specially designed hand-crafted features to detect ships, and they usually work well only at one scale, which lacks generalization and is impractical for identifying ships of various scales in multiresolution images. In this paper, we propose a novel deep feature-based method to detect ships in very high-resolution optical remote sensing images. In our method, a region proposal network is used to generate ship candidates from feature maps produced by a deep convolutional neural network. To efficiently detect ships of various scales, a hierarchical selective filtering layer is proposed to map features at different scales to the same scale space. The proposed method is an end-to-end network that can detect both inshore and offshore ships ranging from dozens to thousands of pixels. We test our network on a large ship data set, to be released in the future, consisting of Google Earth images, GaoFen-2 images, and unmanned aerial vehicle data. Experiments demonstrate the high precision and robustness of our method. Further experiments on aerial images show its good generalization to unseen scenes.
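The summary's core idea of mapping features at different scales into a common scale space can be illustrated with a toy sketch: parallel filters with different receptive-field sizes (here hypothetical 1×1, 3×3, and 5×5 mean filters, not the paper's learned kernels) are applied to one feature map, each producing a same-shape response, so responses across scales become directly comparable. This is an illustrative assumption about the mechanism, not the authors' implementation.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive single-channel 2-D filtering with 'same' zero padding."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.empty_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def hierarchical_selective_filter(feature_map, kernel_sizes=(1, 3, 5)):
    """Apply parallel filters of several receptive-field sizes and stack
    the same-shape responses, placing multi-scale evidence in one
    spatial scale space (mean filters stand in for learned kernels)."""
    responses = []
    for ks in kernel_sizes:
        k = np.full((ks, ks), 1.0 / (ks * ks))  # averaging kernel
        responses.append(conv2d_same(feature_map, k))
    return np.stack(responses)  # shape: (num_scales, H, W)

fmap = np.arange(36, dtype=float).reshape(6, 6)
out = hierarchical_selective_filter(fmap)
print(out.shape)  # → (3, 6, 6)
```

Because the 1×1 kernel is the identity here, the first response slice equals the input, while the larger kernels smooth over progressively wider neighborhoods.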
ISSN:0196-2892
1558-0644
DOI:10.1109/TGRS.2018.2848901