Object Detection in Remote Sensing Images With Parallel Feature Fusion and Cascade Global Attention Head
Published in: IEEE Geoscience and Remote Sensing Letters, 2024, Vol. 21, pp. 1-5
Format: Article
Language: English
Summary: Convolutional neural networks (CNNs) have driven significant development in remote sensing (RS) object detection. To achieve concise and effective optimization, we propose a two-stage detector with a parallel feature fusion strategy and a cascade global attention (GA) mechanism for object detection in RS images, named PC-RCNN. We first design a feature pyramid network with two parallel branches (PB-FPN), corresponding to the top-down and bottom-up feature fusion pathways, respectively. Different optimization modules can be adopted in different pathways to avoid potential module incompatibility when they are connected in series. Such a parallel feature fusion strategy achieves both higher detection accuracy and higher computational efficiency than previous series fusion modes (an illustrative sketch of the parallel fusion idea follows this record). Furthermore, we design a GA block to enhance feature representations of regions and propose a cascade GA head network (CGA-Head) for accurate category prediction and location estimation. Experiments on a challenging large-scale dataset, namely DOTA, show that the proposed PC-RCNN achieves a mean average precision (mAP) of 77.63%, which is comparable to other state-of-the-art CNN-based models.

Keywords: Parallel feature fusion, cascade GA, object detection, RS images
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2024.3385231
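The abstract describes PB-FPN as a top-down and a bottom-up fusion pathway that run in parallel from the same backbone features rather than in series. The PyTorch-style sketch below illustrates only that general idea; it is not the authors' released code. The module name `ParallelBranchFPN`, the ResNet-50-style channel counts, the 1x1 lateral and 3x3 smoothing convolutions, and the element-wise-sum merge of the two branches are all assumptions made for illustration.

```python
# Hedged sketch of a parallel (rather than series) two-branch feature fusion.
# Assumes backbone features C2..C5 with channels (256, 512, 1024, 2048) as in
# ResNet-50. All module names and the sum-based merge are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ParallelBranchFPN(nn.Module):
    """Fuse multi-scale features along a top-down and a bottom-up pathway that
    both start from the same backbone outputs, then merge them level by level."""

    def __init__(self, in_channels=(256, 512, 1024, 2048), out_channels=256):
        super().__init__()
        # 1x1 lateral convs put every level on the same channel count.
        self.lateral = nn.ModuleList(
            [nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels]
        )
        # Separate 3x3 smoothing convs per pathway, so each branch can carry
        # its own optimization modules without interfering with the other.
        self.smooth_td = nn.ModuleList(
            [nn.Conv2d(out_channels, out_channels, 3, padding=1) for _ in in_channels]
        )
        self.smooth_bu = nn.ModuleList(
            [nn.Conv2d(out_channels, out_channels, 3, padding=1) for _ in in_channels]
        )

    def forward(self, feats):
        lat = [l(f) for l, f in zip(self.lateral, feats)]

        # Top-down branch: start at the coarsest level and add upsampled context.
        td = [lat[-1]]
        for i in range(len(lat) - 2, -1, -1):
            up = F.interpolate(td[0], size=lat[i].shape[-2:], mode="nearest")
            td.insert(0, lat[i] + up)
        td = [conv(x) for conv, x in zip(self.smooth_td, td)]

        # Bottom-up branch: start at the finest level and add downsampled detail.
        bu = [lat[0]]
        for i in range(1, len(lat)):
            down = F.max_pool2d(bu[-1], kernel_size=2, stride=2)
            # Align spatial size in case of odd feature-map shapes.
            down = F.interpolate(down, size=lat[i].shape[-2:], mode="nearest")
            bu.append(lat[i] + down)
        bu = [conv(x) for conv, x in zip(self.smooth_bu, bu)]

        # Parallel merge: the two pathways never feed into each other; their
        # outputs are simply combined per level (element-wise sum here).
        return [t + b for t, b in zip(td, bu)]


if __name__ == "__main__":
    # Fake ResNet-50-style feature maps for a 512x512 input.
    feats = [
        torch.randn(1, 256, 128, 128),
        torch.randn(1, 512, 64, 64),
        torch.randn(1, 1024, 32, 32),
        torch.randn(1, 2048, 16, 16),
    ]
    fpn = ParallelBranchFPN()
    for p in fpn(feats):
        print(p.shape)  # each level: (1, 256, H, W)
```

Because the two pathways do not feed into each other, each can be equipped with its own optimization modules, which matches the abstract's motivation of avoiding module incompatibility that can arise when fusion stages are chained in series.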