Decoding movement kinematics from EEG using an interpretable convolutional neural network
Published in: Computers in Biology and Medicine, 2023-10, Vol. 165, Article 107323
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Continuous decoding of hand kinematics has recently been explored for the intuitive control of electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs). Deep neural networks (DNNs) are emerging as powerful decoders thanks to their ability to automatically learn features from lightly pre-processed signals. However, DNNs for kinematics decoding lack interpretability of the learned features and have so far been used only to build within-subject decoders, without testing other training approaches potentially beneficial for reducing calibration time, such as transfer learning. Here, we aim to overcome these limitations by using an interpretable convolutional neural network (ICNN) to decode 2-D hand kinematics (position and velocity) from EEG in a pursuit tracking task performed by 13 participants. The ICNN is trained with both within-subject and cross-subject strategies, also testing the feasibility of transferring knowledge learned on other subjects to a new one. Moreover, the network eases the interpretation of the learned spectral and spatial EEG features. Our ICNN outperformed most of the other state-of-the-art decoders, showing the best trade-off between performance, size, and training time. Furthermore, transfer learning improved kinematics prediction in the low-data regime. The network attributed the highest relevance for decoding to the delta band across all subjects, and to higher frequencies (alpha, beta, low gamma) for a cluster of them; contralateral central and parieto-occipital sites were the most relevant, reflecting the involvement of sensorimotor, visual, and visuo-motor processing. The approach improved the quality of kinematics prediction from EEG while allowing interpretation of the most relevant spectral and spatial features.
Highlights:
• Decoding hand kinematics from EEG via an interpretable convolutional neural network.
• The network achieves the best trade-off between model performance, size, and training time.
• Subject-to-subject transfer learning reduces network training time.
• Network features match known spectral and spatial neural correlates of kinematics.
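The record does not include code. As a rough illustration only, the sketch below shows a minimal, hypothetical PyTorch model in the spirit of the compact temporal-spatial CNNs used for EEG regression: a temporal convolution (band-pass-like filters), a spatial convolution across electrodes (interpretable spatial patterns), and a linear head that regresses 2-D position and velocity. All names, shapes, and hyperparameters (KinematicsCNN, n_channels, window length, filter sizes) are assumptions for illustration and do not reproduce the authors' ICNN.

```python
# Hypothetical sketch (NOT the authors' ICNN): a compact temporal-spatial CNN
# that regresses 2-D hand position and velocity from multichannel EEG windows.
import torch
import torch.nn as nn

class KinematicsCNN(nn.Module):
    def __init__(self, n_channels=61, n_samples=256, n_outputs=4):
        super().__init__()
        # Temporal convolution: learns band-pass-like filters along time.
        self.temporal = nn.Conv2d(1, 8, kernel_size=(1, 33), padding=(0, 16))
        # Spatial convolution: collapses the electrode dimension, giving one
        # spatial pattern per feature map (the interpretable part).
        self.spatial = nn.Conv2d(8, 16, kernel_size=(n_channels, 1), groups=8)
        self.bn = nn.BatchNorm2d(16)
        self.pool = nn.AvgPool2d((1, 8))
        self.head = nn.Linear(16 * (n_samples // 8), n_outputs)

    def forward(self, x):              # x: (batch, 1, n_channels, n_samples)
        x = self.temporal(x)
        x = self.spatial(x)
        x = torch.square(self.bn(x))   # band-power-like feature
        x = self.pool(x).flatten(1)
        return self.head(x)            # (batch, 4): x/y position and velocity

model = KinematicsCNN()
dummy = torch.randn(2, 1, 61, 256)     # 2 EEG windows of 61 channels x 256 samples
print(model(dummy).shape)               # torch.Size([2, 4])
```

Such a model could be trained within-subject, or pre-trained on other subjects and fine-tuned on a new one (the transfer-learning setting the abstract refers to); the small parameter count is what makes the performance/size/training-time trade-off favourable.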
ISSN: 0010-4825, 1879-0534
DOI: 10.1016/j.compbiomed.2023.107323