Omnidirectional texturing based on robust 3D registration through Euclidean reconstruction from two spherical images

Bibliographic Details
Published in: Computer Vision and Image Understanding, 2010-04, Vol. 114 (4), p. 491-499
Main Authors: Banno, Atsuhiko; Ikeuchi, Katsushi
Format: Article
Language: English
Description
Summary: We propose a semi-automatic omnidirectional texturing method that maps a spherical image onto a dense 3D model obtained by a range sensor. Accurate texturing requires accurate estimation of the extrinsic parameters. To estimate these parameters, we propose a robust 3D registration method between a dense range data set and a sparse stereo data set obtained from spherical images. To measure distances between the two data sets, we introduce generalized distances that take into account the 3D error distributions of the stereo data. To reconstruct 3D models from images, we use two spherical images taken at arbitrary positions and in arbitrary poses. We then propose a novel rectification method for spherical images, derived from the essential (E) matrix, that facilitates the estimation of disparities. Experimental results show that the proposed method maps the spherical image onto the dense 3D model effectively and accurately.
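
To illustrate the idea behind the generalized distances mentioned in the summary, the following is a minimal sketch, not the paper's actual implementation: each point-to-point residual between a stereo-reconstructed point and its matched range point is weighted by the 3D error covariance of the stereo point, i.e. a Mahalanobis-type distance, and the registration objective sums these over all matched pairs. The function and variable names (generalized_distance_sq, registration_cost) are hypothetical.

import numpy as np

def generalized_distance_sq(p_stereo, cov_stereo, q_range, R, t):
    """Squared generalized distance between a stereo point with 3D error
    covariance and its matched range point, under rigid motion (R, t)."""
    residual = (R @ p_stereo + t) - q_range
    # Propagate the stereo point's 3D error covariance through the rotation.
    cov = R @ cov_stereo @ R.T
    return float(residual @ np.linalg.solve(cov, residual))

def registration_cost(R, t, stereo_pts, stereo_covs, range_pts):
    """Registration objective: sum of generalized distances over matched pairs.
    Minimizing this over (R, t) aligns the sparse stereo data with the dense
    range data while down-weighting directions of large stereo uncertainty."""
    return sum(
        generalized_distance_sq(p, C, q, R, t)
        for p, C, q in zip(stereo_pts, stereo_covs, range_pts)
    )

# Example with illustrative values: one matched pair, identity pose.
R0, t0 = np.eye(3), np.zeros(3)
p = np.array([1.0, 0.0, 2.0])    # stereo-reconstructed point
C = np.diag([0.01, 0.01, 0.04])  # its 3D error covariance (larger depth uncertainty)
q = np.array([1.1, 0.0, 2.0])    # matched range-sensor point
print(registration_cost(R0, t0, [p], [C], [q]))

In this sketch the covariance weighting makes residuals along uncertain directions of the stereo reconstruction (typically depth) contribute less to the cost than residuals along well-constrained directions.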
ISSN: 1077-3142; 1090-235X
DOI: 10.1016/j.cviu.2009.12.005