
Interest point detection from multi‐beam light detection and ranging point cloud using unsupervised convolutional neural network

Bibliographic Details
Published in: IET Image Processing, 2021-02, Vol. 15 (2), p. 369-377
Main Authors: Yin, Deyu; Zhang, Qian; Liu, Jingbin; Liang, Xinlian; Wang, Yunsheng; Chen, Shoubin; Maanpää, Jyri; Hyyppä, Juha; Chen, Ruizhi
Format: Article
Language: English
Description
Summary: Interest point detection plays an important role in many computer vision applications. This work is motivated by the light detection and ranging (LiDAR) odometry task in autonomous driving. Existing methods cannot detect enough interest points in unstructured scenes with few buildings or trees around, and LiDAR odometry consequently fails to maintain continuous localisation. An interest point detector is proposed that detects interest points from a multi‐beam LiDAR point cloud using an unsupervised convolutional neural network. The point cloud is first projected into two‐dimensional structured data according to the scanning geometry. Convolutional neural network filters trained in an unsupervised manner then take this structured data as input and generate a local feature map. Finally, interest points are obtained by extracting the grids that differ significantly from their neighbouring grids. On an odometry benchmark, experiments show that the proposed detector captures more local details, which contributes to a decrease of more than 16% in point cloud registration error in highway scenes.
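
The summary outlines a three-step pipeline: project the scan onto a two-dimensional grid following the scanning geometry, compute a per-grid feature map with unsupervised convolutional filters, and keep grids that differ strongly from their neighbours. The Python sketch below only illustrates the first and third steps under assumed parameters (64 beams, a 3° to -25° vertical field of view, a 0.5 m threshold); the paper's unsupervised CNN feature map is replaced here by raw range values, and the function names are hypothetical, not taken from the authors' code.

# Minimal sketch (not the authors' implementation) of the pipeline described
# in the summary: spherical projection of a multi-beam LiDAR scan onto a 2D
# grid, then selection of grids that deviate strongly from their neighbours.
# Beam count, field of view, and threshold are illustrative assumptions.
import numpy as np

def project_to_range_image(points, n_beams=64, n_cols=1024,
                           fov_up_deg=3.0, fov_down_deg=-25.0):
    """Project an (N, 3) point cloud into an (n_beams, n_cols) range image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points[:, :3], axis=1) + 1e-8
    yaw = np.arctan2(y, x)                      # azimuth angle
    pitch = np.arcsin(z / r)                    # elevation angle
    fov_up, fov_down = np.radians(fov_up_deg), np.radians(fov_down_deg)
    # Map angles to grid coordinates: row = beam index, column = azimuth bin.
    rows = (fov_up - pitch) / (fov_up - fov_down) * (n_beams - 1)
    cols = (yaw + np.pi) / (2 * np.pi) * (n_cols - 1)
    rows = np.clip(np.round(rows).astype(int), 0, n_beams - 1)
    cols = np.clip(np.round(cols).astype(int), 0, n_cols - 1)
    image = np.zeros((n_beams, n_cols), dtype=np.float32)
    image[rows, cols] = r                       # last range wins per grid cell
    return image

def neighbour_difference_keypoints(feature_map, threshold=0.5):
    """Mark grids whose value deviates strongly from the mean of their 8 neighbours."""
    padded = np.pad(feature_map, 1, mode='edge')
    neigh_sum = np.zeros_like(feature_map)
    h, w = feature_map.shape
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            neigh_sum += padded[1 + dr:1 + dr + h, 1 + dc:1 + dc + w]
    diff = np.abs(feature_map - neigh_sum / 8.0)
    return diff > threshold                     # boolean mask of interest-point grids

# Usage with a random synthetic scan (stand-in for a real multi-beam LiDAR frame):
scan = np.random.uniform(-50, 50, size=(100000, 3))
range_image = project_to_range_image(scan)
mask = neighbour_difference_keypoints(range_image)
print("interest-point grids:", int(mask.sum()))

In the method described by the paper, the map fed to the neighbour comparison is produced by convolutional filters trained without labels; such a learned feature map could be substituted for the raw range image before the neighbour-difference step.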
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12027