Autonomous exploration using rapid perception of low-resolution image information

Bibliographic Details
Published in: Autonomous Robots 2012-02, Vol. 32 (2), p. 115-128
Main Authors: Murali, Vidya N., Birchfield, Stanley T.
Format: Article
Language:English
Description
Summary: We present a technique for mobile robot exploration in unknown indoor environments using only a single forward-facing camera. Rather than processing all the data, the method intermittently examines only small 32×24 downsampled grayscale images. We show that for the task of indoor exploration the visual information is highly redundant, allowing successful navigation even using only a small fraction of the available data. The method keeps the robot centered in the corridor by estimating two state parameters: the orientation within the corridor and the distance to the end of the corridor. The orientation is determined by combining the results of five complementary measures, while the estimated distance to the end combines the results of three complementary measures. These measures, which are predominantly information-theoretic, are analyzed independently, and the combined system is tested in the corridors of several unknown buildings exhibiting a wide variety of appearances, demonstrating the sufficiency of low-resolution visual information for mobile robot exploration. Because the algorithm discards such a large percentage of the pixels both spatially and temporally, processing occurs at an average of 1000 frames per second, freeing the processor for other concurrent tasks.
ISSN: 0929-5593, 1573-7527
DOI: 10.1007/s10514-011-9262-z
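
To make the abstract's pipeline concrete, the sketch below downsamples a grayscale frame to 32×24 and computes a single entropy-based steering cue. It is a minimal illustration assuming NumPy arrays as frames; the function names (downsample, entropy_bits, orientation_cue) and the left/right entropy comparison are hypothetical stand-ins for one of the paper's five orientation measures, not the authors' implementation.

```python
import numpy as np

def downsample(gray, out_h=24, out_w=32):
    """Block-average a grayscale frame to out_w x out_h pixels.
    Rows/columns that do not fit an integer block are trimmed."""
    h, w = gray.shape
    fy, fx = h // out_h, w // out_w
    trimmed = gray[: fy * out_h, : fx * out_w].astype(np.float64)
    return trimmed.reshape(out_h, fy, out_w, fx).mean(axis=(1, 3))

def entropy_bits(img, bins=32):
    """Shannon entropy (in bits) of the pixel-intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 255.0))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def orientation_cue(small):
    """Hypothetical single cue: entropy difference between the right
    and left halves of the low-resolution frame. The paper combines
    five complementary measures; this stands in for just one."""
    half = small.shape[1] // 2
    return entropy_bits(small[:, half:]) - entropy_bits(small[:, :half])

if __name__ == "__main__":
    # Simulated 480x640 grayscale frame in place of a real camera image.
    frame = np.random.default_rng(0).integers(0, 256, (480, 640))
    small = downsample(frame)  # 24x32 thumbnail, 768 pixels
    print(small.shape, orientation_cue(small))
```

Because each such cue touches only the 768 pixels of the 32×24 thumbnail, a per-frame estimate of this kind costs far less than capturing the frame itself, which is consistent with the reported average of 1000 frames per second.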