3-D HANet: A Flexible 3-D Heatmap Auxiliary Network for Object Detection
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023, Vol. 61, pp. 1-13
Format: Article
Language: English
Summary: 3-D object detection is a vital part of outdoor scene perception. Learning the complete size and accurate position of objects from an incomplete point cloud structure is essential to 3-D object detection. We propose a novel flexible 3-D heatmap auxiliary network (3-D HANet) for object detection. To recover complete structure and location information from an incomplete point cloud, we propose a 3-D heatmap that encodes object information. We also design a plug-and-play auxiliary network based on the 3-D heatmap, which improves the accuracy of the whole detection network without adding computation at the inference stage. We validate 3-D HANet on three classic 3-D object detectors: PointPillars, sparsely embedded convolutional detection (SECOND), and structure-aware single-stage 3-D object detection from point cloud (SA-SSD). Experimental results show that the auxiliary network strengthens the feature extraction ability of the backbone network, so that the predicted boxes match the ground-truth boxes more closely in size and orientation. Furthermore, we conducted verification experiments on the state-of-the-art (SOTA) detector CasA and further improved its position on the official KITTI leaderboard.
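The "plug-and-play auxiliary network" idea described in the summary can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all function names and the toy computations are hypothetical. The point it shows is structural: the auxiliary heatmap branch consumes backbone features only during training, so removing it leaves inference cost unchanged.

```python
# Hypothetical sketch of a training-only auxiliary head (illustrative names,
# toy math; not the actual 3-D HANet architecture).

def backbone(points):
    # Stand-in feature extractor: one scalar feature per 3-D point.
    return [sum(p) for p in points]

def detection_head(features):
    # Stand-in detector: one "box score" per feature.
    return [f * 0.5 for f in features]

def auxiliary_heatmap_head(features):
    # Stand-in auxiliary target: a per-point "heatmap" value in [0, 1].
    return [min(1.0, abs(f) / 10.0) for f in features]

def forward(points, training=False):
    feats = backbone(points)
    boxes = detection_head(feats)
    if training:
        # Auxiliary supervision exists only in the training graph; its loss
        # would back-propagate into the backbone and sharpen its features.
        heatmap = auxiliary_heatmap_head(feats)
        return boxes, heatmap
    return boxes  # inference path: the auxiliary branch adds no computation
```

At inference, `forward(points)` runs only the backbone and detection head, which is how such an auxiliary design can improve accuracy at zero inference-time cost.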
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3250229