Automated Feature Extraction on AsMap for Emotion Classification using EEG
| Published in: | arXiv.org 2022-03 |
|---|---|
| Main Authors: | , , , |
| Format: | Article |
| Language: | English |
| Summary: | Emotion recognition using EEG has been widely studied to address the challenges associated with affective computing. Using manual feature extraction methods on EEG signals results in sub-optimal performance of the learning models. With the advancements in deep learning as a tool for automated feature engineering, this work proposes a hybrid of manual and automatic feature extraction methods. The asymmetry in different brain regions is captured in a 2D vector, termed the AsMap, from the differential entropy features of EEG signals. These AsMaps are then used to extract features automatically using a convolutional neural network model. The proposed feature extraction method is compared with differential entropy and with other feature extraction methods such as relative asymmetry, differential asymmetry and differential caudality. Experiments are conducted on the SJTU Emotion EEG Dataset (SEED) and the DEAP dataset for classification problems with different numbers of classes. Results indicate that the proposed feature extraction method yields higher classification accuracy, outperforming the other feature extraction methods. The highest classification accuracy of 97.10% is achieved on a three-class classification problem using the SEED dataset. Further, this work also assesses the impact of window size on classification accuracy. |
|---|---|
| ISSN: | 2331-8422 |
| DOI: | 10.48550/arxiv.2201.12055 |
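
As a rough illustration of the manual feature-extraction step described in the summary above, the sketch below computes band-wise differential-entropy (DE) features for one EEG window and arranges pairwise channel differences into a 2D asymmetry map. This is only a minimal sketch under stated assumptions: the frequency-band edges, the Gaussian DE formula, the pairwise-difference layout of the map, and the 62-channel / 200 Hz example shape are illustrative choices, not the paper's exact AsMap construction, which this record does not specify.

```python
"""Hedged sketch: band-wise differential-entropy (DE) features and a
pairwise asymmetry map for a single EEG window (illustrative only)."""

import numpy as np
from scipy.signal import butter, filtfilt

# Assumed frequency bands in Hz; the paper's band choices may differ.
BANDS = {"theta": (4, 8), "alpha": (8, 14), "beta": (14, 31), "gamma": (31, 50)}


def bandpass(signal, low, high, fs, order=4):
    """Zero-phase band-pass filter for a single-channel EEG signal."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)


def differential_entropy(window):
    """DE of a band-limited window under a Gaussian assumption:
    0.5 * log(2 * pi * e * variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * (np.var(window) + 1e-12))


def asymmetry_maps(eeg_window, fs):
    """Build one 2D asymmetry map per band from per-channel DE values.

    eeg_window: array of shape (n_channels, n_samples).
    Returns an array of shape (n_bands, n_channels, n_channels) where
    entry [b, i, j] = DE_i - DE_j in band b (an assumed pairwise layout,
    not necessarily the paper's AsMap definition).
    """
    n_channels = eeg_window.shape[0]
    maps = np.zeros((len(BANDS), n_channels, n_channels))
    for b, (low, high) in enumerate(BANDS.values()):
        de = np.array([
            differential_entropy(bandpass(channel, low, high, fs))
            for channel in eeg_window
        ])
        maps[b] = de[:, None] - de[None, :]
    return maps


if __name__ == "__main__":
    # Synthetic 62-channel, 1-second window at 200 Hz (SEED-like shape).
    rng = np.random.default_rng(0)
    window = rng.standard_normal((62, 200))
    print(asymmetry_maps(window, fs=200).shape)  # (4, 62, 62)
```

Maps of this shape could then be fed to a small convolutional network for the automatic feature-extraction stage; that stage is not sketched here since the record gives no details of the CNN architecture.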