Subject Independent EMG Analysis by Using Low-Cost Hardware

Format: Conference Proceeding
Language: English

Summary: Thanks to the increasing interest in robotic prosthetic devices controlled by means of physiological signals, a continuously growing number of solutions is being proposed. Usually, the proposed solutions are very expensive and created ad hoc for the final user. For this reason, a large share of potential users cannot afford this kind of technology. Furthermore, adapting the software to the user is time consuming and physically stressful for the subject. This paper presents a low-cost prosthesis framework covering the three fundamental aspects of a rehabilitation system: the prosthesis, the sensors used to record the physiological signals, and the software connecting the two. To reduce costs, we chose a 3D-printed prosthetic hand from an open-source project. We recorded electromyography (EMG) signals from the subjects' muscles using a low-cost armband, an all-in-one solution that is easy to wear and remove. The EMG signals are preprocessed so that they can be used online, and they serve to train a probabilistic model for classification. Furthermore, the model is built on data from different subjects in order to obtain a subject-independent framework that can be used by any subject, with no need for draining training phases. We evaluate our solution with a leave-one-out approach by classifying three different hand grasps. Finally, the 3D-printed hand reproduces the movement performed by the subject. Data were recorded from four subjects, each repeating the selected movements five times, and we obtained an overall accuracy of 76.8%.
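
The abstract does not name the preprocessing features or the probabilistic model, so the following is only a minimal sketch of the kind of subject-independent, leave-one-subject-out pipeline it describes: windowed EMG from all subjects but one is reduced to per-channel features and used to fit a probabilistic classifier, which is then tested on the held-out subject. The RMS features, the eight-channel armband, and the Gaussian Naive Bayes model are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch of a subject-independent, leave-one-subject-out
# evaluation as described in the abstract. RMS windowing and Gaussian
# Naive Bayes are assumptions; the paper does not specify either.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

N_CHANNELS = 8   # e.g. an 8-channel EMG armband (assumption)
WINDOW = 200     # samples per analysis window (assumption)


def rms_features(emg_window):
    """Root-mean-square amplitude per channel for one window.

    emg_window: array of shape (WINDOW, N_CHANNELS).
    Returns a feature vector of shape (N_CHANNELS,).
    """
    return np.sqrt(np.mean(np.square(emg_window), axis=0))


def leave_one_subject_out(windows_by_subject, labels_by_subject):
    """Train on all subjects but one, test on the held-out subject.

    windows_by_subject: {subject_id: array (n_windows, WINDOW, N_CHANNELS)}
    labels_by_subject:  {subject_id: array (n_windows,)} with grasp
                        labels, e.g. 0, 1, 2 for three hand grasps.
    Returns per-subject test accuracies.
    """
    accuracies = {}
    for held_out in windows_by_subject:
        # Build the subject-independent training set from the other subjects.
        X_train = np.vstack([
            np.array([rms_features(w) for w in windows_by_subject[s]])
            for s in windows_by_subject if s != held_out
        ])
        y_train = np.concatenate([
            labels_by_subject[s]
            for s in windows_by_subject if s != held_out
        ])

        # Fit the probabilistic classifier and test on the unseen subject.
        model = GaussianNB()
        model.fit(X_train, y_train)

        X_test = np.array(
            [rms_features(w) for w in windows_by_subject[held_out]])
        y_pred = model.predict(X_test)
        accuracies[held_out] = accuracy_score(
            labels_by_subject[held_out], y_pred)
    return accuracies
```

Averaging the per-subject accuracies returned here corresponds to the overall figure the paper reports (76.8% over four subjects, five repetitions per movement).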

ISSN: 2577-1655
DOI: 10.1109/SMC.2018.00472