
Using a Variable-Friction Robot Hand to Determine Proprioceptive Features for Object Classification During Within-Hand-Manipulation

Bibliographic Details
Published in: IEEE Transactions on Haptics, 2020-07, Vol. 13 (3), p. 600-610
Main Authors: Spiers, Adam J., Morgan, Andrew S., Srinivasan, Krishnan, Calli, Berk, Dollar, Aaron M.
Format: Article
Language: English
Description
Summary: Interactions with an object during within-hand manipulation (WIHM) constitute an assortment of gripping, sliding, and pivoting actions. In addition to their manipulation benefits, the re-orientation and motion of objects within the hand also provide a rich array of additional haptic information to the sensory organs of the hand via these interactions. In this article, we utilize variable friction (VF) robotic fingers to execute a rolling WIHM on a variety of objects while recording 'proprioceptive' actuator data, which is then used for object classification (i.e., without tactile sensors). Rather than hand-picking a select group of features for this task, our approach begins with 66 general features computed, based on gradient changes, from the actuator position and load profiles of each object-rolling manipulation. An Extra Trees classifier performs object classification while also ranking each feature's importance. Using only the six most important 'Key Features' from the general set, a classification accuracy of 86% was achieved for distinguishing the six geometric objects in our data set. Comparatively, when all 66 features are used, the accuracy is 89.8%.
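
The summary describes a pipeline of proprioceptive feature extraction, Extra Trees classification, and importance-based selection of the top six features. A minimal sketch of such a pipeline using scikit-learn is shown below; the synthetic data, class count, and hyperparameters are placeholders for illustration only, not the authors' actual features or settings.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: 66 features per rolling manipulation
# (in the paper these are derived from actuator position and load
# profiles) and 6 geometric object classes.
rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 300, 66, 6
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, n_classes, size=n_samples)

# Train an Extra Trees classifier on all 66 features and read off
# its impurity-based feature importances.
clf_all = ExtraTreesClassifier(n_estimators=200, random_state=0)
acc_all = cross_val_score(clf_all, X, y, cv=5).mean()
clf_all.fit(X, y)

# Rank features by importance, keep only the six most important
# ("Key Features"), and re-evaluate with that reduced set.
top6 = np.argsort(clf_all.feature_importances_)[::-1][:6]
clf_top = ExtraTreesClassifier(n_estimators=200, random_state=0)
acc_top = cross_val_score(clf_top, X[:, top6], y, cv=5).mean()

print(f"All 66 features: accuracy = {acc_all:.3f}")
print(f"Top 6 features:  accuracy = {acc_top:.3f}")
```

With the authors' real feature set, this kind of reduction reportedly costs only a few percentage points of accuracy (86% with six features versus 89.8% with all 66).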
ISSN: 1939-1412, 2329-4051
DOI: 10.1109/TOH.2019.2958669