Physics-Informed Attention Temporal Convolutional Network for EEG-Based Motor Imagery Classification

Bibliographic Details
Published in: IEEE Transactions on Industrial Informatics, 2023-02, Vol. 19 (2), pp. 2249-2258
Main Authors: Altaheri, Hamdi; Muhammad, Ghulam; Alsulaiman, Mansour
Format: Article
Language:English
Summary: The brain-computer interface (BCI) is a cutting-edge technology that has the potential to change the world. Electroencephalogram (EEG) motor imagery (MI) signals have been used extensively in many BCI applications to assist disabled people, control devices or environments, and even augment human capabilities. However, the limited performance of brain-signal decoding is restricting the broad growth of the BCI industry. In this article, we propose an attention-based temporal convolutional network (ATCNet) for EEG-based motor imagery classification. The ATCNet model uses multiple techniques to boost MI classification performance with a relatively small number of parameters. ATCNet employs scientific machine learning to design a domain-specific deep learning model with interpretable and explainable features, multihead self-attention to highlight the most valuable features in MI-EEG data, a temporal convolutional network to extract high-level temporal features, and a convolutional-based sliding window to augment the MI-EEG data efficiently. The proposed model outperforms current state-of-the-art techniques on the BCI Competition IV-2a dataset, with accuracies of 85.38% and 70.97% in the subject-dependent and subject-independent modes, respectively.
ISSN: 1551-3203, 1941-0050
DOI: 10.1109/TII.2022.3197419
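
The abstract describes four building blocks: a convolutional feature extractor, a convolutional-based sliding window for data augmentation, multihead self-attention, and a temporal convolutional network. The sketch below shows, in PyTorch, one way such components could be wired together. All module choices, kernel sizes, window sizes, and layer counts are assumptions made for illustration only; this is not the authors' released ATCNet implementation.

```python
import torch
import torch.nn as nn


class AttentionTCNSketch(nn.Module):
    """Hypothetical attention + TCN pipeline over sliding windows of EEG features."""

    def __init__(self, n_channels=22, n_classes=4,
                 d_model=32, n_heads=4, window_len=16, window_stride=31):
        super().__init__()
        # Convolutional block: temporal filtering, then depthwise spatial filtering
        # that collapses the EEG channel axis into d_model feature maps.
        self.conv_block = nn.Sequential(
            nn.Conv2d(1, d_model, kernel_size=(1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(d_model),
            nn.Conv2d(d_model, d_model, kernel_size=(n_channels, 1),
                      groups=d_model, bias=False),
            nn.BatchNorm2d(d_model),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
        )
        self.window_len = window_len
        self.window_stride = window_stride
        # Multihead self-attention highlights the most informative time steps.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # A small dilated 1-D convolution stack stands in for the TCN.
        self.tcn = nn.Sequential(
            nn.Conv1d(d_model, d_model, kernel_size=4, dilation=1, padding="same"),
            nn.ELU(),
            nn.Conv1d(d_model, d_model, kernel_size=4, dilation=2, padding="same"),
            nn.ELU(),
        )
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                       # x: (batch, 1, channels, samples)
        feats = self.conv_block(x).squeeze(2)   # (batch, d_model, time)
        time_len = feats.shape[-1]
        logits = []
        # Sliding windows over the feature sequence act as data augmentation:
        # each window is classified and the per-window logits are averaged.
        for start in range(0, time_len - self.window_len + 1, self.window_stride):
            win = feats[:, :, start:start + self.window_len]   # (batch, d_model, W)
            seq = win.permute(0, 2, 1)                          # (batch, W, d_model)
            attn_out, _ = self.attn(seq, seq, seq)
            tcn_out = self.tcn(attn_out.permute(0, 2, 1))       # (batch, d_model, W)
            logits.append(self.head(tcn_out[:, :, -1]))         # last time step
        return torch.stack(logits).mean(dim=0)                  # (batch, n_classes)


if __name__ == "__main__":
    model = AttentionTCNSketch()
    dummy = torch.randn(2, 1, 22, 1125)  # 2 trials, 22 EEG channels, 1125 samples
    print(model(dummy).shape)            # torch.Size([2, 4])
```

Running the file prints the logits shape for a dummy batch shaped like a BCI Competition IV-2a trial (22 channels, 1125 samples); the per-window averaging mirrors the sliding-window augmentation idea named in the abstract.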