Change Their Perception: RGB-D for 3-D Modeling and Recognition

Bibliographic Details
Published in: IEEE Robotics & Automation Magazine, 2013-12, Vol. 20 (4), p. 49-59
Main Authors: Ren, Xiaofeng; Fox, Dieter; Konolige, Kurt
Format: Article
Language:English
Description
Summary: RGB-D cameras, such as the Microsoft Kinect, are active sensors that provide high-resolution, dense color and depth information at real-time frame rates. The wide availability of affordable RGB-D cameras is causing a revolution in perception and changing the landscape of robotics and related fields. RGB-D perception has attracted a great deal of attention and research effort across several fields over the last three years. In this article, we summarize and discuss our ongoing research on promising uses of RGB-D in three-dimensional (3-D) mapping and 3-D recognition. Combining the strengths of optical cameras and laser rangefinders, the joint use of color and depth in RGB-D sensing makes visual perception more robust and efficient, leading to practical systems that build detailed 3-D models of large indoor spaces, as well as systems that reliably recognize everyday objects in complex scenes. RGB-D perception is still a burgeoning technology: a rapidly growing number of research projects are being conducted on or with RGB-D perception while the hardware quickly improves. We believe that RGB-D perception will take center stage in perception and, by making robots see much better than before, will enable a wide variety of perception-based research and applications.
ISSN: 1070-9932, 1558-223X
DOI: 10.1109/MRA.2013.2253409