
Automatic scan registration using 3D linear and planar features

Bibliographic Details
Published in: 3D Research, 2010-09, Vol. 1 (3), Article 6
Main Authors: Yao, Jian; Ruggeri, Mauro R.; Taddei, Pierluigi; Sequeira, Vítor
Format: Article
Language: English
Description
Summary: We present a common framework for the accurate and automatic registration of two geometrically complex 3D range scans using linear or planar features. The linear features of a range scan are extracted with an efficient split-and-merge line-fitting algorithm, which refines 2D edges extracted from the associated reflectance image using the corresponding 3D depth information. The planar features are extracted with a robust planar segmentation method, which partitions a range image into a set of planar patches. We propose an efficient probability-based RANSAC algorithm to automatically register two overlapping range scans. Our algorithm searches for matching pairs of linear (planar) features in the two range scans that lead to good alignments. Line orientation (plane normal) angles and line (plane) distances formed by pairs of linear (planar) features are invariant under the rigid transformation and are used to find candidate matches. To efficiently search for candidate pairs and groups of matched features, we build a fast search codebook. Given two sets of matched features, the rigid transformation between the two scans is computed using iterative linear optimization algorithms. The efficiency and accuracy of our registration algorithm were evaluated on several challenging range data sets.
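The matching step described above relies on the fact that the angle between two line directions and the distance between two lines are invariant under a rigid transformation. The following minimal sketch (not the authors' code; function names and the example lines are illustrative) computes such a pairwise descriptor for two 3D lines and checks that it is unchanged after applying a rigid motion:

```python
import numpy as np

def line_pair_descriptor(p1, d1, p2, d2):
    """Angle between the directions and closest distance between two 3D lines,
    each given by a point p and a direction d."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Unsigned angle between the (orientation-free) line directions.
    angle = np.arccos(np.clip(abs(d1 @ d2), 0.0, 1.0))
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < 1e-9:
        # Parallel lines: distance of p2 from the first line.
        diff = p2 - p1
        dist = np.linalg.norm(diff - (diff @ d1) * d1)
    else:
        # Skew/intersecting lines: distance along the common normal.
        dist = abs((p2 - p1) @ n) / np.linalg.norm(n)
    return angle, dist

def apply_rigid(R, t, p, d):
    """Apply a rigid transform (R, t) to a line (point rotates and translates,
    direction only rotates)."""
    return R @ p + t, R @ d

# A sample rigid transform: rotation about z plus a translation.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, -2.0, 3.0])

# Two example lines in the first scan.
p1, d1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
p2, d2 = np.array([0.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])

desc_before = line_pair_descriptor(p1, d1, p2, d2)
desc_after = line_pair_descriptor(*apply_rigid(R, t, p1, d1),
                                  *apply_rigid(R, t, p2, d2))
print(np.allclose(desc_before, desc_after))  # → True
```

Because these pairwise quantities survive the unknown scan-to-scan motion, they can be precomputed and indexed (e.g. in a search codebook, as the paper does) to quickly retrieve candidate feature matches before verifying alignments.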
ISSN: 2092-6731
DOI: 10.1007/3DRes.03(2010)06