VP-Net: Voxels as Points for 3D Object Detection
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-04, p. 1-1
Main Authors:
Format: Article
Language: English
Summary: 3D object detection with LiDAR point clouds is a challenging problem that requires 3D scene understanding, yet the task is critical to autonomous driving. Existing voxel-based 3D object detectors are increasingly popular but have several shortcomings. For example, during voxelization, features of distant, sparse point clouds are largely discarded, which leads to missed detections. Additionally, the correlation of points across voxels and the importance of different voxels within a region are not well learned. We therefore present a robust network (VP-Net) that views voxels as points to accurately detect 3D objects in LiDAR point clouds and to capture objects' internal relationships. The output features of the 3D CNN are treated as key points. The relationships between these key points are then organized into local graphs, and a self-attention mechanism over the graphs enhances object feature extraction. Finally, the Euclidean distance between the extracted features guides the model's weight reassignment, strengthening the importance of neighboring points and thereby enhancing the internal feature aggregation of objects. Experiments on the KITTI and nuScenes 3D object detection benchmarks demonstrate the effectiveness of enhancing inter-voxel validity within object features and show that the proposed VP-Net achieves state-of-the-art performance.
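
Reading only from the summary above, the pipeline can be pictured as: key points taken from the 3D CNN output, local graphs over those key points, self-attention within each graph, and distance-guided reweighting of neighbors. The sketch below is a minimal illustration of that idea, not the authors' implementation: PyTorch, the k-NN construction of the local graphs, the subtractive distance term inside the softmax, and all names and hyperparameters (LocalGraphAttention, dim, k) are assumptions made here for illustration.

```python
import torch
import torch.nn as nn

class LocalGraphAttention(nn.Module):
    """Self-attention over k-NN local graphs of key points, with
    feature-distance-guided reweighting of neighbors (illustrative sketch)."""

    def __init__(self, dim: int, k: int = 16):
        super().__init__()
        self.k = k
        self.q = nn.Linear(dim, dim)
        self.kv = nn.Linear(dim, 2 * dim)
        self.scale = dim ** -0.5

    def forward(self, xyz: torch.Tensor, feats: torch.Tensor) -> torch.Tensor:
        # xyz:   (N, 3) key-point coordinates (3D CNN output treated as points)
        # feats: (N, C) key-point features
        # Local graphs: connect each key point to its k nearest spatial neighbors.
        _, knn_idx = torch.cdist(xyz, xyz).topk(self.k, largest=False)  # (N, k)
        q = self.q(feats).unsqueeze(1)                  # (N, 1, C)
        k_, v = self.kv(feats)[knn_idx].chunk(2, -1)    # each (N, k, C)
        attn = (q * k_).sum(-1) * self.scale            # (N, k) attention logits
        # Weight reassignment: Euclidean distance between extracted features
        # down-weights dissimilar neighbors (one plausible reading of the summary).
        feat_dist = (feats.unsqueeze(1) - feats[knn_idx]).norm(dim=-1)  # (N, k)
        attn = torch.softmax(attn - feat_dist, dim=-1)
        return (attn.unsqueeze(-1) * v).sum(1)          # (N, C) aggregated features
```

Under these assumptions, a call such as LocalGraphAttention(dim=64)(xyz, feats) with xyz of shape (N, 3) and feats of shape (N, 64) returns re-aggregated (N, 64) features in which each key point has pooled information from its local graph.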
ISSN: 0196-2892
DOI: 10.1109/TGRS.2023.3271020