Arbitrary-Oriented Object Detection in Remote Sensing Images Based on Polar Coordinates
Published in: IEEE Access, 2020, Vol. 8, pp. 223373-223384
Main Authors:
Format: Article
Language: English
Summary: Arbitrary-oriented object detection is an important task in the field of remote sensing object detection. Existing studies have shown that the polar coordinate system has clear advantages for modeling rotated objects: it uses fewer parameters to achieve more accurate rotated object detection. However, current state-of-the-art deep learning detectors are all modeled in Cartesian coordinates. In this article, we introduce the polar coordinate system to the deep learning detector for the first time and propose an anchor-free Polar Remote Sensing Object Detector (P-RSDet), which achieves competitive detection accuracy with a simpler object representation model and fewer regression parameters. In P-RSDet, arbitrary-oriented object detection is achieved by predicting the center point and regressing one polar radius and two polar angles (sketched after the record below). In addition, to express the geometric constraint between the polar radius and the polar angles, a Polar Ring Area Loss function is proposed to improve the prediction accuracy of corner positions. Experiments on the DOTA, UCAS-AOD and NWPU VHR-10 datasets show that P-RSDet achieves state-of-the-art performance with a simpler model and fewer regression parameters.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3041025
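
The summary states that a detection is recovered from a predicted center point, one polar radius and two polar angles, but does not spell out the decoding. The minimal sketch below is one plausible reading, not the authors' implementation: it assumes the four box corners lie on a circle of the predicted radius around the center, with the remaining two corners obtained by point symmetry (angle + pi); the function name `decode_polar_box` is illustrative only.

```python
import math

def decode_polar_box(cx, cy, r, theta1, theta2):
    """Decode a (center, polar radius, two polar angles) representation
    into four corner points of an oriented box.

    Assumption (not stated in the abstract): corners sit on a circle of
    radius r centered at (cx, cy), and the opposite corners are the
    point-symmetric images at theta + pi.
    """
    corners = []
    for theta in (theta1, theta2, theta1 + math.pi, theta2 + math.pi):
        corners.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return corners

# Example: a box centered at (100, 50) whose corners lie 20 px from the center
print(decode_polar_box(100.0, 50.0, 20.0, 0.3, 2.2))
```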