Teaching Mobile Robots Using Custom-Made Tools by a Semi-Direct Method

Bibliographic Details
Published in: Journal of Robotics and Mechatronics, 2016-04, Vol. 28 (2), p. 242-254
Main Authors: Heredia, Jorge David Figueroa, Sahloul, Hamdi, Ota, Jun
Format: Article
Language: English
Description
Summary: [Figure: teaching a grasping point with a custom-made tool] We propose a method for conveying human knowledge to home and office assistance robots by teaching them how to grasp objects with a custom-made tool. Specifically, we propose a semi-direct teaching method that respects the hardware limitations of the robot while utilizing human experience for intuitive teaching. We specify the information necessary for grasping objects through the generation of teaching data, which include the grasping force and the relative position and orientation. To respect the hardware limitations while still allowing inexperienced users to perform the teaching process easily, we use a teaching tool that has the same mechanism as the robot's end effector. To simplify teaching, we developed a sensing system that reduces the teaching time while providing accurate measurements. The robot then uses the teaching data to grasp the object. Experiments conducted with volunteers demonstrated the validity of the proposed method: the teaching data for three different tasks were each generated in less than 30 s, and accurate measurements of both the grasping position and the grasping force were obtained.
ISSN: 0915-3942, 1883-8049
DOI: 10.20965/jrm.2016.p0242