Context-Based Oriented Object Detector for Small Objects in Remote Sensing Imagery

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, p. 100526-100539
Main Authors: Jiang, Qunyan, Dai, Juying, Rui, Ting, Shao, Faming, Lu, Guanlin, Wang, Jinkang
Format: Article
Language: English
Description
Summary: Object detection in remote sensing imagery is a challenging task in computer vision and has high research value. To improve the classification and positioning accuracy of object detection, we propose a new multi-scale oriented object detector suited to small objects. First, a feature fusion network based on information balance (IBFF) is proposed; on the premise that the output features carry sufficient information, it reduces the reuse of features from different backbone layers, suppresses the interference of redundant information, and retains enough shallow detail. Second, to exploit deep and shallow features efficiently, enhance important features, and reduce background noise, different attention-based context feature fusion modules (DACFF) are designed according to the characteristics of each feature fusion stage. Finally, an improved oriented bounding box regression strategy is proposed, which obtains oriented bounding boxes in a simpler and more effective way. The proposed method was evaluated on two public remote sensing datasets, DOTA and HRSC2016, achieving mAP values of 80.96% and 95.01%, respectively, which verifies the effectiveness of the proposed algorithm.
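
The abstract describes the DACFF modules only at a high level, so the following is a minimal illustrative sketch, not the authors' implementation: a PyTorch module that fuses a shallow, high-resolution feature map with an upsampled deep feature map and re-weights the result with a channel-attention gate, in the spirit of attention-based context feature fusion. The class name AttentionFusion, the channel count, and the squeeze-and-excitation style gate are assumptions made for illustration.

# Illustrative sketch only; the paper's actual IBFF/DACFF design is not
# specified in this record, so the structure below is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionFusion(nn.Module):
    """Fuse a shallow (high-resolution) and a deep (low-resolution) feature map
    using a learned channel-attention gate."""

    def __init__(self, channels: int):
        super().__init__()
        # Channel attention: global average pooling followed by a bottleneck MLP.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.out_conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, shallow: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        # Upsample the deep feature map to the shallow map's spatial size.
        deep_up = F.interpolate(deep, size=shallow.shape[-2:], mode="nearest")
        fused = shallow + deep_up
        # Re-weight channels so informative features are enhanced and
        # background noise is suppressed before the final smoothing conv.
        fused = fused * self.gate(fused)
        return self.out_conv(fused)


if __name__ == "__main__":
    module = AttentionFusion(channels=256)
    shallow = torch.randn(1, 256, 128, 128)  # shallow, high-resolution features
    deep = torch.randn(1, 256, 64, 64)       # deep, low-resolution features
    print(module(shallow, deep).shape)       # torch.Size([1, 256, 128, 128])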
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3204622