DenseNet Based Speech Imagery EEG Signal Classification using Gramian Angular Field

Bibliographic Details
Main Authors: Islam, Md. Monirul, Shuvo, Md. Maruf Hossain
Format: Conference Proceeding
Language: English
Description
Summary: One of the most challenging tasks in a Brain-Computer Interface (BCI) system is classifying speech imagery electroencephalography (EEG) signals. In this work, we address the low classification accuracy of existing methods using deep learning and an improved beta band selection method. When a subject imagines uttering a word rather than saying it aloud, electrical activity in the brain changes; this activity is recorded with an EEG recording device. The recorded EEG data is then processed with the Dual-Tree Complex Wavelet Transform (DTCWT) to select the beta band, which is associated with imagery-related activity. To take advantage of Deep Convolutional Neural Networks (DCNN), we convert the time series EEG data into images using two variants of the Gramian Angular Field (GAF): the Gramian Summation Angular Field (GASF) and the Gramian Difference Angular Field (GADF). These images are then fed to DenseNet for classification; DenseNet is an improved DCNN architecture that mitigates the vanishing gradient problem. Of the two image generation techniques, GADF achieves the higher average classification accuracy, 90.68%. The dataset used in this study, `The KARA ONE Database', was collected from the Computational Linguistics Lab, University of Toronto, Canada.
ISSN:2378-2692
DOI:10.1109/ICAEE48663.2019.8975572
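The GAF encoding described in the summary maps a 1-D time series to a 2-D image by rescaling the series to [-1, 1], converting each value to a polar angle, and taking pairwise trigonometric combinations of those angles. The sketch below illustrates the standard GASF/GADF definitions in NumPy; it is not the authors' code, and the function name and min-max normalization choice are assumptions for illustration:

```python
import numpy as np

def gramian_angular_field(x, method="summation"):
    """Encode a 1-D time series as a GASF or GADF image.

    GASF[i, j] = cos(phi_i + phi_j), GADF[i, j] = sin(phi_i - phi_j),
    where phi = arccos of the series rescaled to [-1, 1].
    """
    x = np.asarray(x, dtype=float)
    # Min-max rescale to [-1, 1] so arccos is defined (assumed normalization).
    x_scaled = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
    # Clip guards against tiny floating-point overshoot outside [-1, 1].
    phi = np.arccos(np.clip(x_scaled, -1.0, 1.0))
    if method == "summation":
        return np.cos(phi[:, None] + phi[None, :])  # GASF (symmetric)
    return np.sin(phi[:, None] - phi[None, :])      # GADF (antisymmetric)
```

In a pipeline like the one summarized above, this encoding would be applied per channel to the beta-band EEG segments, producing image stacks suitable as DenseNet input.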