Multi-sensor fusion strategy to obtain 3-D occupancy profile
| Main Authors: | , , |
|---|---|
| Format: | Conference Proceeding |
| Language: | English |
| Subjects: | |
| Online Access: | Request full text |

| Summary: | This paper presents a strategy for fusing information from two vision sensors and one infrared (IR) proximity sensor to obtain a three-dimensional occupancy profile of a robotic workspace, identify key features, and build a 3-D model of the objects in the workspace. The two vision sensors are mounted on a stereo rig on the sidewall of the robotic workcell, and the IR sensor is mounted on the wrist of the robot. The vision sensors on the stereo rig provide the three-dimensional position of any point in the workspace, while the IR sensor provides the distance of an object from the sensor. The information from these sensors is fused using a probabilistic approach based on Bayesian formalism in an occupancy grid framework to obtain a 3-D occupancy model of the workspace. The proposed fusion and sensor modeling scheme is shown to reduce the uncertainties of the individual sensors and to outperform several alternative fusion schemes. |
|---|---|
| ISSN: | 1553-572X |
| DOI: | 10.1109/IECON.2005.1569225 |
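
The summary describes Bayesian fusion of stereo-vision and IR range data in an occupancy grid, but the record carries no implementation detail. The sketch below illustrates that general technique only: a log-odds occupancy update along a single 1-D ray of cells, fusing one stereo depth estimate with one IR proximity reading. The inverse sensor model and every numeric parameter (`p_occ`, `p_free`, the standard deviations, the cell spacing) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def log_odds(p):
    """Convert an occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def inverse_sensor_model(cell_range, measured_range, sigma, p_occ=0.7, p_free=0.3):
    """Illustrative (assumed) inverse sensor model: cells well before the
    measured range are likely free, cells near it are likely occupied,
    cells beyond it carry no information (prior 0.5)."""
    if cell_range < measured_range - 2.0 * sigma:
        return p_free        # in front of the return: probably empty
    elif cell_range <= measured_range + 2.0 * sigma:
        return p_occ         # around the return: probably occupied
    return 0.5               # behind the return: unknown

def fuse_readings(grid_ranges, readings, prior=0.5):
    """Standard occupancy-grid Bayesian update: independent readings are
    combined per cell by summing their log-odds contributions."""
    l0 = log_odds(prior)
    l = np.full(len(grid_ranges), l0)
    for measured_range, sigma in readings:
        for i, r in enumerate(grid_ranges):
            p = inverse_sensor_model(r, measured_range, sigma)
            l[i] += log_odds(p) - l0
    return 1.0 / (1.0 + np.exp(-l))   # back to occupancy probabilities

# Example: cells every 5 cm along one ray, fused from a noisier stereo
# depth estimate and a more precise IR proximity reading (assumed values).
cells = np.arange(0.05, 1.0, 0.05)
stereo = (0.62, 0.06)   # measured range [m], assumed std. dev. [m]
ir     = (0.60, 0.02)
print(np.round(fuse_readings(cells, [stereo, ir]), 2))
```

Because the per-cell log-odds sum commutes, the same update can absorb readings from either sensor in any order, which is the property that lets an occupancy grid act as the common representation for heterogeneous sensors.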