
Extraction of 3D Features from Complex Environments in Visual Tracking Applications

Bibliographic Details
Main Authors: Marron, M., Garcia, J. C., Sotelo, M. A., Pizarro Perez, D., Bravo Munoz, I.
Format: Conference Proceeding
Language: English
Description
Summary: This paper presents an algorithm for processing visual data to obtain relevant information that is subsequently used to track the moving objects in complex indoor environments. In autonomous robot applications, visual detection of obstacles in a dynamic environment from a mobile platform is a complicated task, and the robustness of this process is fundamental to the reliability of tracking and navigation. The solution presented here is based on a stereo-vision system, so that 3D information about the position of each object in the robot's local environment is extracted directly from the cameras. In the proposed application, all objects in the robot's local environment, both dynamic and static, except the structure of the environment itself, are considered obstacles. This specification requires a distinction between building elements (ceiling, walls, columns, and so on) and the remaining items in the robot's surroundings, so a classification stage must be developed together with the detection task. In addition, the obtained data can be used to implement a partial reconstruction of the environmental structure surrounding the robot. All these algorithms are explained in detail in the following sections, and visual results are included at the end of the paper.
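
The abstract describes the pipeline only at a high level. As an illustration of the kind of computation involved, below is a minimal sketch, assuming a rectified stereo pair with known focal length, principal point, and baseline, of how a matched pixel and its disparity yield a 3D position in the camera frame, and how floor and ceiling points might be split from obstacle candidates by simple height thresholds. Every function name, parameter, and threshold here is hypothetical; the paper's actual classification method is not reproduced in this record.

```python
import numpy as np

def triangulate(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a pixel (u, v) with stereo disparity into a 3D point
    in the left-camera frame, assuming rectified cameras.

    Standard stereo geometry: Z = fx * b / d; X and Y follow from the
    pinhole model. Intrinsics and baseline come from calibration."""
    Z = fx * baseline / disparity        # depth shrinks as disparity grows
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return np.array([X, Y, Z])

def classify(point, camera_height, ceiling_height, margin=0.15):
    """Crude structure/obstacle split by height alone (a hypothetical
    stand-in for the paper's building-element classification).

    Camera-frame convention assumed: Y points downward, so the floor
    lies near Y = +camera_height and the ceiling near
    Y = camera_height - ceiling_height (a negative value)."""
    y = point[1]
    if y > camera_height - margin:                     # at floor level
        return "structure"
    if y < camera_height - ceiling_height + margin:    # at ceiling level
        return "structure"
    return "obstacle"

# Example: a pixel 40 px right of the principal point with 25 px disparity,
# seen by a camera 1.2 m above the floor in a 2.8 m-high room.
p = triangulate(u=360, v=260, disparity=25.0,
                fx=700.0, fy=700.0, cx=320.0, cy=240.0, baseline=0.12)
print(p, classify(p, camera_height=1.2, ceiling_height=2.8))
```

A height-only rule obviously cannot separate walls or columns from genuine obstacles, which is precisely why the abstract calls for a dedicated classification stage alongside detection: distinguishing building elements in general requires more structure, such as fitting planes to the 3D points or a partial reconstruction of the environment.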
ISSN: 1091-5281
DOI: 10.1109/IMTC.2007.379319