Tactile control based on Gaussian images and its application in bi-manual manipulation of deformable objects
Published in: | Robotics and autonomous systems 2017-08, Vol.94, p.148-161 |
Main Authors: | , , , , , |
Format: | Article |
Language: | English |
Summary: | In-hand robot manipulation of deformable objects is an open and key problem for the coming generation of robots. A central goal in the literature is an adaptable and agile framework for tasks in which a robot grasps and manipulates many kinds of deformable objects. Much prior work controls the manipulation task through a model of the manipulated object. Although such techniques model deformations precisely, they are time consuming, and using them in real environments is almost impossible given the large variety of objects the robot may encounter. In this paper, we propose a model-independent framework to control the finger movements of the hands while the robot executes manipulation tasks with deformable objects. The technique is based on tactile images, which serve as a common interface for different tactile sensors, and uses servo-tactile control to stabilize the grasping points, avoid sliding, and adapt the contact configuration with respect to the position and magnitude of the applied force. Tactile images are obtained using a combination of dynamic Gaussians, which yields a common representation for tactile data produced by sensors with different technologies and resolutions. The framework was tested on several manipulation tasks in which the objects are deformed, without using a model of them. |
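The abstract describes rendering tactile readings as an image built from a combination of dynamic Gaussians, one per taxel, so that sensors with different layouts and resolutions share a common representation. A minimal sketch of that idea follows; the grid size, width parameters, and the pressure-dependent widening of each Gaussian are illustrative assumptions, not the paper's actual parameterization:

```python
import numpy as np

def tactile_image(taxel_xy, pressures, res=32, sigma0=0.08, k=0.04):
    """Render taxel readings as a normalized res x res tactile image by
    summing one 2D Gaussian per taxel (hypothetical parameterization)."""
    xs = np.linspace(0.0, 1.0, res)
    gx, gy = np.meshgrid(xs, xs)          # image grid over the unit square
    img = np.zeros((res, res))
    for (cx, cy), p in zip(taxel_xy, pressures):
        sigma = sigma0 + k * p            # "dynamic" width: grows with applied force
        d2 = (gx - cx) ** 2 + (gy - cy) ** 2
        img += p * np.exp(-d2 / (2.0 * sigma ** 2))  # pressure-weighted Gaussian
    m = img.max()
    return img / m if m > 0 else img      # normalize to a common [0, 1] range

# Two taxels with different pressures map to one fixed-size image,
# regardless of the source sensor's layout or resolution.
img = tactile_image([(0.3, 0.5), (0.7, 0.5)], [1.0, 0.5])
```

Because every sensor is projected onto the same fixed-size grid, the downstream servo-tactile controller can operate on the image alone, without knowing which sensor produced it.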
ISSN: | 0921-8890 1872-793X |
DOI: | 10.1016/j.robot.2017.04.017 |