Visual programming for accessible interactive musculoskeletal models

Bibliographic Details
Published in: BMC Research Notes, 2022-03, Vol. 15 (1), p. 108, Article 108
Main Authors: Manczurowsky, Julia; Badadhe, Mansi; Hasson, Christopher J.
Format: Article
Language: English
Description
Summary: Musculoskeletal modeling and simulation are powerful research and education tools in engineering, neuroscience, and rehabilitation. Interactive musculoskeletal models (IMMs) can be controlled by muscle activity recorded with electromyography (EMG). IMMs are typically coded using textual programming languages that present barriers to understanding for non-experts. The goal of this project was to use a visual programming language (Simulink) to create and test an IMM that is accessible to non-specialists for research and educational purposes. The developed IMM allows users to practice a goal-directed task with different control modes (keyboard, mouse, and EMG) and actuator types (muscle model, force generator, and torque generator). Example data were collected using both keyboard and EMG control. One male participant in his early 40s performed a goal-directed task for four sequential trials using each control mode. For EMG control, the participant used a low-cost EMG system with consumer-grade EMG sensors and an Arduino microprocessor. The participant successfully performed the task with both control modes, but the inability to grade muscle model excitation and co-activate antagonist muscles limited performance with keyboard control. The IMM developed for this project serves as a foundation that can be further tailored to specific research and education needs.
ISSN: 1756-0500
DOI: 10.1186/s13104-022-05994-5
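
Note on EMG processing: The summary above mentions grading muscle model excitation from recorded EMG. As an illustration only (not the authors' implementation, which is built visually in Simulink), the Python sketch below shows one common way to turn a raw EMG stream into a graded excitation in [0, 1] by rectifying, smoothing, and normalizing the signal. All function names, parameter values, and the simulated data are hypothetical assumptions.

import numpy as np

def emg_to_excitation(raw_emg, fs=1000.0, window_s=0.1, mvc_amplitude=1.0):
    """Rectify, smooth, and normalize an EMG signal into a [0, 1] excitation.

    raw_emg:       1-D array of zero-mean EMG samples (arbitrary units)
    fs:            sampling rate in Hz (assumed)
    window_s:      moving-average window for the linear envelope, in seconds
    mvc_amplitude: envelope amplitude at maximum voluntary contraction,
                   used to scale the output into [0, 1]
    """
    rectified = np.abs(raw_emg)                  # full-wave rectification
    n = max(1, int(window_s * fs))               # samples per smoothing window
    kernel = np.ones(n) / n
    envelope = np.convolve(rectified, kernel, mode="same")  # linear envelope
    return np.clip(envelope / mvc_amplitude, 0.0, 1.0)      # graded excitation

# Usage with simulated data: 2 s of EMG-like noise with a burst in the middle
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
burst = (t > 0.8) & (t < 1.2)
emg = np.random.randn(t.size) * (0.05 + 0.5 * burst)
excitation = emg_to_excitation(emg, fs=fs, mvc_amplitude=0.5)
print(excitation.max())

A graded signal of this kind can continuously scale a muscle model's excitation, in contrast to keyboard control, where key presses provide only on/off input; this is one plausible reading of why keyboard control limited excitation grading and antagonist co-activation in the summary above.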