CoF-Net: A Progressive Coarse-to-Fine Framework for Object Detection in Remote-Sensing Imagery
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, p. 1-1
Main Authors: , ,
Format: Article
Language: English
Summary: Object detection in remote-sensing images is a crucial task in the fields of Earth observation and computer vision. Despite impressive progress in modern remote-sensing object detectors, there are still three challenges to overcome: 1) complex background interference, 2) dense and cluttered arrangement of instances, and 3) large scale variations. These challenges lead to two key deficiencies, namely coarse features and coarse samples, which limit the performance of existing object detectors. To address these issues, in this paper, a novel coarse-to-fine framework (CoF-Net) is proposed for object detection in remote-sensing imagery. CoF-Net mainly consists of two parallel branches, namely coarse-to-fine feature adaptation (CoF-FA) and coarse-to-fine sample assignment (CoF-SA), which aim to progressively enhance feature representation and select stronger training samples, respectively. Specifically, CoF-FA smoothly refines the original coarse features into multi-spectral nonlocal fine features with discriminative spatial-spectral details and semantic relations. Meanwhile, CoF-SA dynamically considers samples from coarse to fine by progressively introducing geometric and classification constraints for sample assignment during training. Comprehensive experiments on three public datasets demonstrate the effectiveness and superiority of the proposed method.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3233881
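
The abstract above describes a two-branch design: CoF-FA refines coarse backbone features into finer ones, while CoF-SA tightens the criteria for positive training samples as training progresses, first geometric and later classification constraints. Below is a minimal, hypothetical PyTorch sketch of that coarse-to-fine pattern. The module and function names, the residual refinement, and all thresholds are illustrative assumptions, not the paper's actual CoF-FA/CoF-SA operations (in particular, the multi-spectral nonlocal refinement is not reproduced here).

```python
# Hypothetical illustration only: names, the residual refinement, and all
# thresholds below are assumptions for exposition; they are NOT the paper's
# actual CoF-FA / CoF-SA operations.
import torch
import torch.nn as nn

class CoarseToFineFeature(nn.Module):
    """Stand-in for CoF-FA: residually refine a coarse feature map."""
    def __init__(self, channels: int):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, coarse: torch.Tensor) -> torch.Tensor:
        # fine = coarse + learned correction (progressive refinement)
        return coarse + self.refine(coarse)

def assign_samples_progressive(ious: torch.Tensor,
                               cls_scores: torch.Tensor,
                               epoch: int, max_epoch: int) -> torch.Tensor:
    """Stand-in for CoF-SA: start with a loose geometric (IoU) gate,
    then progressively add a classification-score constraint."""
    t = epoch / max(max_epoch, 1)          # training progress in [0, 1]
    pos = ious >= 0.4 + 0.1 * t            # geometric gate tightens over time
    if t > 0.5:                            # later epochs: also require confidence
        pos &= cls_scores >= 0.05 + 0.25 * t
    return pos

if __name__ == "__main__":
    fine = CoarseToFineFeature(256)(torch.randn(2, 256, 32, 32))
    mask = assign_samples_progressive(torch.rand(100), torch.rand(100),
                                      epoch=8, max_epoch=12)
    print(fine.shape, int(mask.sum()))
```

The point this sketch illustrates is the schedule: both branches keep early training forgiving (coarse features, loose positives) and tighten their criteria as the detector matures, which is the coarse-to-fine progression the abstract describes.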