
Supervised Multi-scale Attention-guided Ship Detection in Optical Remote Sensing Images

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-1
Main Authors: Hu, Jianming, Zhi, Xiyang, Jiang, Shikai, Tang, Hao, Zhang, Wei, Bruzzone, Lorenzo
Format: Article
Language:English
Description
Summary: Ship detection in optical remote sensing images plays a significant role in a wide range of civilian and military tasks. However, it remains a challenging problem owing to complex environmental interference and the large variety of target scales and positions. To overcome these limitations, we propose a supervised multi-scale attention-guided detection framework, which can effectively detect ships of different scales in both complex open-sea and port scenes. Specifically, a multi-scale supervision module is first proposed to adjust the semantic consistency of different feature levels, yielding extracted features with small semantic gaps. Next, an attention-guided module is utilized to aggregate context information from both spatial and channel dimensions by calculating feature-map correlations, adaptively enhancing the feature representation. Moreover, to preserve the attribute and spatial relationships of the optimized features, we adopt a capsule-based module as the classifier and obtain satisfactory classification performance. Experimental results on two public high-quality datasets demonstrate that the proposed method achieves state-of-the-art performance in comparison with several advanced methods.
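As an illustration of the attention mechanism the abstract describes (aggregating context from spatial and channel dimensions by calculating feature-map correlations), the following is a minimal NumPy sketch of correlation-based channel and spatial attention. The function names, the softmax normalization, and the fusion-by-addition step are illustrative assumptions in the spirit of dual-attention designs, not the authors' actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention(feat):
    """Re-weight channels by their pairwise correlations.
    feat: (C, H, W) feature map; returns the same shape."""
    C, H, W = feat.shape
    X = feat.reshape(C, H * W)          # flatten spatial dimensions
    attn = softmax(X @ X.T, axis=-1)    # (C, C) channel correlation map
    return (attn @ X).reshape(C, H, W)  # aggregate correlated channels

def spatial_attention(feat):
    """Aggregate context over all spatial positions by their correlations."""
    C, H, W = feat.shape
    X = feat.reshape(C, H * W)
    attn = softmax(X.T @ X, axis=-1)    # (HW, HW) position correlation map
    return (X @ attn.T).reshape(C, H, W)

# Toy example: a 4-channel 8x8 feature map; the two attention branches
# are fused here by simple element-wise addition (an assumption).
feat = np.random.rand(4, 8, 8).astype(np.float32)
fused = channel_attention(feat) + spatial_attention(feat)
print(fused.shape)  # (4, 8, 8)
```

In a real detector these branches would operate on backbone features and be learned end-to-end; the sketch only shows the correlation-then-reweight computation both branches share.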
ISSN: 0196-2892
EISSN: 1558-0644
DOI: 10.1109/TGRS.2022.3206306