
Interactive light source position estimation for augmented reality with an RGB‐D camera


Bibliographic Details
Published in: Computer animation and virtual worlds, 2017-01, Vol. 28 (1), p. np-n/a
Main Authors: Boom, Bastiaan J., Orts‐Escolano, Sergio, Ning, Xin X., McDonagh, Steven, Sandilands, Peter, Fisher, Robert B.
Format: Article
Language:English
Description
Summary: The first hybrid CPU‐GPU based method for estimating a point light source position in a scene recorded by an RGB‐D camera is presented. The image and depth information from the Kinect is enough to estimate a light position in the scene, which allows synthetic objects to be rendered into the scene so that they appear realistic enough for augmented reality purposes. The method does not require a light probe or other physical device. To make it suitable for augmented reality, we developed a hybrid implementation that performs light estimation in under 1 second. This is sufficient for most augmented reality scenarios because both the position of the light source and the position of the Kinect are typically fixed. The method estimates the angle of the light source with an average error of 20°. By rendering synthetic objects into the recorded scene, we illustrate that this accuracy is good enough for the rendered objects to look realistic. Copyright © 2015 John Wiley & Sons, Ltd.
Graphical abstract: A synthetic object (dragon, right) rendered into a real‐world scene recorded with the RGB‐D sensor, where the illumination and rendered shadow of the synthetic object match the scene, based on a light source position estimated using only the intensity image and depth information.
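The abstract describes recovering a point light position from intensity and depth alone. A minimal sketch of the underlying idea (not the authors' implementation, and without their CPU-GPU split) is to evaluate candidate light positions against a Lambertian shading model, using surface points and normals such as an RGB-D camera would provide, and keep the candidate whose predicted shading best matches the observed intensities. All function and variable names below are hypothetical:

```python
import numpy as np

def estimate_light_position(points, normals, intensities, candidates):
    """Hypothetical sketch: choose the candidate point-light position whose
    Lambertian prediction I = max(0, n . L_dir) best matches the observed
    intensities, by sum of squared residuals.

    points:      (N, 3) surface points from the depth map
    normals:     (N, 3) unit surface normals derived from depth
    intensities: (N,)   observed image intensities (assumed albedo-normalized)
    candidates:  iterable of (3,) candidate light positions to test
    """
    best_pos, best_err = None, np.inf
    for c in candidates:
        d = c - points                                   # surface-to-light vectors
        d = d / np.linalg.norm(d, axis=1, keepdims=True)  # normalize per point
        pred = np.clip(np.sum(normals * d, axis=1), 0.0, None)  # Lambertian term
        err = np.sum((pred - intensities) ** 2)
        if err < best_err:
            best_pos, best_err = c, err
    return best_pos
```

In this toy form the candidate loop is a brute-force search; the paper's sub-second hybrid implementation presumably parallelizes the per-pixel residual evaluation on the GPU, but that division of work is not detailed in the record above.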
ISSN: 1546-4261
1546-427X
DOI: 10.1002/cav.1686