Towards Sonomyography-Based Real-Time Control of Powered Prosthesis Grasp Synergies
Main Authors:
Format: Conference Proceeding
Language: English
Summary: Sonomyography (ultrasound imaging) offers a way of classifying complex muscle activity and configuration, with higher SNR and lower hardware requirements than sEMG, using various supervised learning algorithms. The physiological image obtained from an ultrasound probe can be used to train a classification algorithm that runs on real-time ultrasound images. The predicted values can then be mapped onto assistive or teleoperated robots. This paper describes the classification of ultrasound information and its subsequent mapping onto a soft robotic gripper as a step toward direct synergy control. A Support Vector Classification algorithm was used to classify ultrasound information into a set of defined states: open, closed, pinch, and hook grasps. Once the model was trained with the ultrasound image data, real-time input from the forearm was used to predict these states. The final predicted state then set joint stiffnesses in the soft actuators, changing their interactions or synergies, to obtain the corresponding soft robotic gripper states. Data collection was carried out on five test subjects for eight trials each. An average classification accuracy of 93% was obtained over all data. This real-time ultrasound-based control of a soft robotic gripper constitutes a promising step toward intuitive and robust biosignal-based control methods for robots.
ISSN: 1558-4615, 2694-0604
DOI: 10.1109/EMBC44109.2020.9176483
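The summary above outlines a pipeline of classifying ultrasound frames into grasp states with a Support Vector Classifier and mapping each predicted state to joint stiffnesses of a soft gripper. The following is a minimal, hypothetical sketch of that kind of pipeline, assuming scikit-learn's SVC on flattened image frames; the frame size, kernel choice, stiffness values, and helper names are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (not the authors' code): classify flattened ultrasound
# frames into grasp states with scikit-learn's SVC, then look up a hypothetical
# set of joint stiffnesses for the soft gripper from the predicted state.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Grasp states named in the abstract.
STATES = ["open", "closed", "pinch", "hook"]

# Hypothetical stiffness mapping (placeholder values, not from the paper):
# each state sets per-joint stiffnesses that shape the gripper's synergy.
STIFFNESS_MAP = {
    "open":   [0.2, 0.2, 0.2],
    "closed": [0.9, 0.9, 0.9],
    "pinch":  [0.9, 0.9, 0.2],
    "hook":   [0.2, 0.9, 0.9],
}

def flatten_frames(frames):
    """Flatten (N, H, W) ultrasound frames into (N, H*W) feature vectors."""
    return frames.reshape(frames.shape[0], -1)

# Stand-in data: random frames with random labels, only so the sketch runs.
rng = np.random.default_rng(0)
frames = rng.random((200, 64, 64))
labels = rng.choice(STATES, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    flatten_frames(frames), labels, test_size=0.25, random_state=0
)

# Linear SVC; the paper does not specify the kernel, so this is an assumption.
clf = SVC(kernel="linear")
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# "Real-time" loop body: predict a state from one new frame and look up the
# joint stiffnesses that would be commanded to the soft gripper.
new_frame = rng.random((1, 64, 64))
state = clf.predict(flatten_frames(new_frame))[0]
print("predicted state:", state, "-> joint stiffnesses:", STIFFNESS_MAP[state])
```

In a real setup the stand-in random frames would be replaced by streamed probe images and the printed stiffnesses by commands to the gripper's actuators; the structure of the loop stays the same.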