Learning to grasp familiar objects using object view recognition and template matching
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Summary:
Robots are still not able to grasp all unforeseen objects. Finding a proper grasp configuration, i.e. the position and orientation of the arm relative to the object, remains challenging. One approach to grasping unforeseen objects is to recognize an appropriate grasp configuration from previous grasp demonstrations. The underlying assumption in this approach is that new objects that are similar to known ones (i.e. familiar objects) can be grasped in a similar way. However, finding a grasp representation and a grasp similarity metric remains the main challenge in developing an approach for grasping familiar objects. In this paper, interactive object view learning and recognition capabilities are integrated into the process of learning and recognizing grasps. The object view recognition module uses an interactive incremental learning approach to recognize object view labels. The grasp pose learning approach uses local and global visual features of a demonstrated grasp to learn a grasp template associated with the recognized object view. A grasp distance measure based on the Mahalanobis distance is used in a grasp template matching approach to recognize an appropriate grasp pose. The experimental results demonstrate the high reliability of the developed template matching approach in recognizing grasp poses. They also show how the robot can incrementally improve its performance in grasping familiar objects.
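The template-matching step described in the summary can be sketched as nearest-template lookup under the Mahalanobis distance. This is a minimal illustration only: the template labels, two-dimensional feature vectors, and covariance values below are invented for the example and are not taken from the paper, whose actual grasp features are local and global visual descriptors.

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance between a feature vector x and a template
    distribution given by its mean and inverse covariance."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

def match_template(query, templates):
    """Return the label and distance of the stored grasp template
    closest to the query feature vector under the Mahalanobis distance."""
    best_label, best_dist = None, np.inf
    for label, (mean, cov) in templates.items():
        # Regularize before inverting, in case the covariance is near-singular.
        cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
        dist = mahalanobis(query, mean, cov_inv)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

# Hypothetical templates: label -> (mean feature vector, covariance matrix).
templates = {
    "top_grasp":  (np.array([0.0, 0.0]), np.eye(2)),
    "side_grasp": (np.array([5.0, 5.0]), np.eye(2)),
}
label, dist = match_template(np.array([0.5, -0.2]), templates)
```

Here the query feature vector lies near the `top_grasp` template mean, so that label is returned; in the paper's setting the matched template would supply the grasp pose associated with the recognized object view.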
ISSN: 2153-0866
DOI: 10.1109/IROS.2016.7759448