
A convolutional neural network for robotic arm guidance using sEMG based frequency-features

Bibliographic Details
Main Authors: Côté-Allard, Ulysse, Nougarou, François, Fall, Cheikh Latyr, Giguère, Philippe, Gosselin, Clément, Laviolette, François, Gosselin, Benoit
Format: Conference Proceeding
Language: English
Description
Summary: Recently, robotics has been seen as a key solution to improve the quality of life of amputees. In order to create smarter robotic prosthetic devices to be used in an everyday context, one must be able to interface them seamlessly with the end-user in an inexpensive, yet reliable way. In this paper, we are looking at guiding a robotic device by detecting gestures through measurement of the electrical activity of muscles captured by surface electromyography (sEMG). Reliable sEMG-based gesture classifiers for end-users are challenging to design, as they must be extremely robust to signal drift, muscle fatigue, and small electrode displacement without the need for constant recalibration. In spite of extensive research, sophisticated sEMG classifiers for prosthesis guidance are not yet widely used, as systems often fail to solve these issues simultaneously. We propose to address these problems by employing Convolutional Neural Networks. Specifically, as a first step, we demonstrate their viability on the problem of gesture recognition for a low-cost, low-sampling-rate (200 Hz), consumer-grade, 8-channel, dry-electrode sEMG device called the Myo armband (Thalmic Labs), tested on able-bodied subjects. To this effect, we assessed the robustness of this machine-learning-oriented approach by classifying a combination of 7 hand/wrist gestures with an accuracy of ~97.9% in real-time, over a period of 6 consecutive days with no recalibration. In addition, we used the classifier (in conjunction with orientation data) to guide a 6-DoF robotic arm, using the armband with the same speed and precision as with a joystick. We also show that the classifier is able to generalize to different users by testing it on 18 participants.
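The abstract describes classifying gestures from overlapping windows of 8-channel, 200 Hz sEMG using frequency-domain features. As a rough illustration of that kind of front end, here is a minimal NumPy sketch that slices a multi-channel recording into overlapping windows and computes per-channel magnitude spectra. The window length, overlap, and feature layout are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np

def semg_frequency_features(signal, fs=200, window_ms=260, overlap_ms=130):
    """Slice multi-channel sEMG into overlapping windows and compute
    per-channel magnitude spectra (illustrative parameters, not the
    paper's exact pipeline).

    signal: array of shape (n_samples, n_channels), e.g. 8 Myo channels.
    Returns: array of shape (n_windows, n_channels, n_freq_bins).
    """
    win = int(fs * window_ms / 1000)           # samples per window
    step = win - int(fs * overlap_ms / 1000)   # hop size between windows
    n_samples, _ = signal.shape
    features = []
    for start in range(0, n_samples - win + 1, step):
        chunk = signal[start:start + win]              # (win, channels)
        spectrum = np.abs(np.fft.rfft(chunk, axis=0))  # magnitude spectrum
        features.append(spectrum.T)                    # (channels, bins)
    return np.stack(features)

# Example: 2 s of simulated 8-channel sEMG at 200 Hz
rng = np.random.default_rng(0)
emg = rng.standard_normal((400, 8))
feats = semg_frequency_features(emg)
print(feats.shape)  # (n_windows, 8 channels, frequency bins)
```

Each window's (channels × frequency bins) feature map could then be fed to a small CNN as a 2-D input, which is the general shape of approach the abstract outlines.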
ISSN: 2153-0866
DOI: 10.1109/IROS.2016.7759384