
A Refined Single-Stage Detector With Feature Enhancement and Alignment for Oriented Objects

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2021, Vol. 14, pp. 8898-8908
Main Authors: Chen, Si-Bao, Dai, Bei-Min, Tang, Jin, Luo, Bin, Wang, Wei-Qiang, Lv, Ke
Format: Article
Language:English
Description
Summary: Compared with traditional object detection using horizontal bounding boxes, detecting rotated objects with arbitrary orientations and various scales is a critical yet challenging task, especially in remote sensing images. Although considerable progress has been made through the use of deep CNNs, there is still room for exploration in oriented object detection. In this article, we propose a refined single-stage detector for oriented objects, which is equipped with an enhanced feature extraction network and an adaptive feature alignment module for finer detection. For feature enhancement, a bidirectional and inner residual feature pyramid network and a multiscale feature aggregation module are devised to obtain more representative features. Then, to address the problem of feature misalignment, we propose an adaptive feature alignment module that reallocates the sampling locations and weights the positive and negative feature points during the refined detection process. The reallocation and weighting are adjusted adaptively according to the results of the coarse detection stage. Experiments conducted on public remote sensing datasets demonstrate the effectiveness of our method.
ISSN:1939-1404
2151-1535
DOI:10.1109/JSTARS.2021.3107549
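
The summary above describes two components: an enhanced feature pyramid (bidirectional, with inner residual connections, plus multiscale aggregation) and an adaptive feature alignment module for the refinement stage. As a rough illustration of the first idea only, below is a minimal PyTorch sketch of bidirectional (top-down then bottom-up) multi-scale feature fusion. The class name, channel widths, and fusion scheme are assumptions made for illustration; this is not the authors' implementation, and the inner residual connections, aggregation module, and alignment module are not sketched here.

```python
# Hypothetical sketch: bidirectional multi-scale feature fusion
# (top-down pass followed by bottom-up pass) over backbone features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BidirectionalFPN(nn.Module):
    """Fuse backbone features (fine -> coarse) with two fusion passes."""

    def __init__(self, in_channels=(256, 512, 1024), out_channels=256):
        super().__init__()
        # 1x1 lateral convs project every backbone level to a common width.
        self.laterals = nn.ModuleList(
            [nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels]
        )
        # 3x3 convs smooth the fused maps after each pass.
        self.td_convs = nn.ModuleList(
            [nn.Conv2d(out_channels, out_channels, 3, padding=1) for _ in in_channels]
        )
        self.bu_convs = nn.ModuleList(
            [nn.Conv2d(out_channels, out_channels, 3, padding=1) for _ in in_channels]
        )

    def forward(self, feats):
        # feats: list of feature maps ordered fine -> coarse, e.g. [C3, C4, C5].
        laterals = [lat(f) for lat, f in zip(self.laterals, feats)]

        # Top-down pass: upsample coarser maps and add them to finer ones.
        td = [laterals[-1]]
        for i in range(len(laterals) - 2, -1, -1):
            up = F.interpolate(td[0], size=laterals[i].shape[-2:], mode="nearest")
            td.insert(0, laterals[i] + up)
        td = [conv(t) for conv, t in zip(self.td_convs, td)]

        # Bottom-up pass: downsample finer maps and add them back to coarser ones.
        bu = [td[0]]
        for i in range(1, len(td)):
            down = F.max_pool2d(bu[-1], kernel_size=2, stride=2)
            # Guard against off-by-one sizes from odd input resolutions.
            down = F.interpolate(down, size=td[i].shape[-2:], mode="nearest")
            bu.append(td[i] + down)
        return [conv(b) for conv, b in zip(self.bu_convs, bu)]


if __name__ == "__main__":
    # Toy multi-scale features at strides 8, 16, 32 for a 256x256 input.
    c3 = torch.randn(1, 256, 32, 32)
    c4 = torch.randn(1, 512, 16, 16)
    c5 = torch.randn(1, 1024, 8, 8)
    outs = BidirectionalFPN()([c3, c4, c5])
    print([o.shape for o in outs])  # 256-channel maps at the original resolutions
```

In this sketch every output level carries information propagated in both directions across scales, which is the general motivation for bidirectional pyramids; how the paper combines this with inner residual connections and adaptive alignment is described only at the level of the abstract above.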