
Improving Image-Based Visual Servoing with Three-Dimensional Features

Bibliographic Details
Published in: The International Journal of Robotics Research, October 2003, Vol. 22 (10-11), pp. 821-839
Main Authors: Cervera, E., del Pobil, A. P., Berry, F., Martinet, P.
Format: Article
Language: English
Description
Summary: Neither of the classical visual servoing approaches, position-based and image-based, is completely satisfactory. In position-based visual servoing the trajectory of the robot is well defined, but the approach suffers mainly from the image features leaving the visual field of the cameras. Image-based visual servoing, on the other hand, has generally been found satisfactory and robust in the presence of camera and hand-eye calibration errors. However, in some cases singularities and local minima may arise, and the robot can be driven towards its joint limits. This paper is a step towards a synthesis that retains the particular advantages of both approaches, i.e., the trajectory of the camera motion is predictable and the image features remain in the field of view of the camera. The basis is the introduction of three-dimensional information into the feature vector: point depth and object pose produce useful behavior in the control of the camera. Using the task-function approach, we demonstrate the relationship between the velocity screw of the camera and the current and desired poses of the object in the camera frame. Camera calibration is assumed, at least coarsely. Experimental results on real robotic platforms illustrate the presented approach.
ISSN: 0278-3649, 1741-3176
DOI: 10.1177/027836490302210003
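
Note: as background to the summary above, the sketch below shows the standard task-function control law for image-based visual servoing with point features, where the depth Z of each point is the three-dimensional information entering the interaction matrix. This is a minimal illustration of the general technique, not the authors' exact formulation (their feature vector also embeds object pose); the function names, gain, and example data are assumptions made for the sketch.

# Minimal IBVS sketch: camera velocity screw from a depth-aware interaction
# matrix. Illustrative only; not the formulation of Cervera et al. (2003).
import numpy as np

def interaction_matrix(x, y, Z):
    """Classical 2x6 interaction matrix of a normalized image point (x, y)
    at depth Z, relating feature motion to the camera velocity screw
    (vx, vy, vz, wx, wy, wz)."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,      -(1.0 + x**2),  y],
        [0.0,     -1.0 / Z,  y / Z, 1.0 + y**2, -x * y,        -x],
    ])

def velocity_screw(points, desired, depths, gain=0.5):
    """Task-function IBVS law v = -lambda * L^+ (s - s*), with current depth
    estimates supplying the 3D information in the stacked interaction matrix."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    error = (np.asarray(points) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Example: four points slightly offset from their desired image positions,
# all at an assumed depth of 0.5 m.
current = [(0.11, 0.10), (-0.10, 0.12), (-0.09, -0.11), (0.10, -0.10)]
desired = [(0.10, 0.10), (-0.10, 0.10), (-0.10, -0.10), (0.10, -0.10)]
depths  = [0.5, 0.5, 0.5, 0.5]
print(velocity_screw(current, desired, depths))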