
Intelligent multi-sensor fusion techniques in flexible manufacturing workcells

Bibliographic Details
Main Authors: Kumar, M., Garg, D.P.
Format: Conference Proceeding
Language: English
Description
Summary: This paper advances specific strategies for fusing data from two of the most extensively used sensors in robotic workcells: vision sensors and proximity sensors. The vision and proximity sensors are used to obtain workspace occupancy information, and data from these redundant yet diverse sensors are fused using Bayesian inference to obtain an occupancy grid model of the workspace. In addition, the paper investigates the use of a Kalman filtering technique to estimate the external forces acting on the robot end-effector, utilizing the robot's underlying dynamics and data from a force/torque (F/T) sensor mounted on its wrist. The camera-to-robot transformation used in the experiments is obtained via a neural network training approach. The proposed strategies for obtaining the transformation and fusing the data are tested and validated in a robotic workcell consisting of an ABB IRB140 six-axis revolute-jointed industrial robot fitted with a force/torque sensor and a proximity sensor, and a single camera located at the top of the workcell.
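
The abstract does not give implementation details of the Bayesian fusion step; the following is a minimal, hypothetical sketch of how two redundant binary occupancy sensors (e.g., a vision sensor and a proximity sensor) could be fused into an occupancy grid with a log-odds Bayesian update. The hit/miss probabilities, grid size, and function names are illustrative assumptions, not values from the paper.

import numpy as np

# Assumed inverse sensor model: probability that a cell is occupied given
# each sensor's binary report. These constants are illustrative only.
P_HIT = {"vision": 0.8, "proximity": 0.7}    # P(occupied | sensor says occupied)
P_MISS = {"vision": 0.2, "proximity": 0.3}   # P(occupied | sensor says free)

def log_odds(p):
    return np.log(p / (1.0 - p))

def fuse(grid_log_odds, measurement, sensor):
    # Bayesian update of the occupancy grid (stored in log-odds form)
    # with one sensor's binary occupancy measurement per workspace cell.
    p = np.where(measurement, P_HIT[sensor], P_MISS[sensor])
    return grid_log_odds + log_odds(p)   # a 0.5 prior contributes zero log-odds

# Example: a 4x4 workspace grid, initially unknown (P = 0.5 everywhere).
grid = np.zeros((4, 4))
vision_meas = np.zeros((4, 4), dtype=bool)
proximity_meas = np.zeros((4, 4), dtype=bool)
vision_meas[1, 2] = True          # both sensors flag cell (1, 2) as occupied
proximity_meas[1, 2] = True

grid = fuse(grid, vision_meas, "vision")
grid = fuse(grid, proximity_meas, "proximity")

occupancy_prob = 1.0 / (1.0 + np.exp(-grid))   # convert back to probabilities
print(occupancy_prob[1, 2])   # fused belief that cell (1, 2) is occupied

Because the two sensors are treated as conditionally independent, agreement between them drives the fused occupancy belief above either sensor's individual confidence, which is the benefit of fusing redundant yet diverse measurements that the paper describes.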
ISSN: 0743-1619, 2378-5861
DOI: 10.23919/ACC.2004.1384707