Subject independent emotion recognition from EEG using VMD and deep learning
Published in: Journal of King Saud University - Computer and Information Sciences, 2022-05, Vol. 34 (5), pp. 1730-1738
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Emotion recognition from electroencephalography (EEG) has proved to be a good choice because EEG, unlike speech signals or facial expressions, cannot be mimicked. EEG signals of emotions are not unique; they vary from person to person, as each individual responds differently to the same stimuli. EEG signals are therefore subject dependent and have proved effective for subject-dependent emotion recognition. However, subject-independent emotion recognition plays an important role in situations such as recognizing the emotions of paralyzed patients or patients with facial burns, where EEG recordings of the subjects' emotions from before the incident are not available to build the recognition model. Hence there is a need to identify common EEG patterns corresponding to each emotion, independent of the subject. In this paper, a subject-independent emotion recognition technique for EEG signals is proposed, using Variational Mode Decomposition (VMD) for feature extraction and a Deep Neural Network as the classifier. Performance evaluation on the benchmark DEAP dataset shows that the combination of VMD and a Deep Neural Network outperforms state-of-the-art techniques in subject-independent emotion recognition from EEG.
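The feature-extraction step the abstract names, Variational Mode Decomposition, splits a signal into a fixed number of band-limited modes by iterative Wiener filtering in the frequency domain. The numpy sketch below is a simplified illustration, not the authors' implementation: the number of modes `K`, the bandwidth penalty `alpha`, the center-frequency initialization, the log-energy feature choice, and the two-tone test signal are all illustrative assumptions.

```python
import numpy as np

def vmd(signal, K=4, alpha=2000.0, n_iter=500, tol=1e-7):
    """Simplified Variational Mode Decomposition (coordinate-descent sketch).

    Decomposes a 1-D real signal into K band-limited modes. alpha controls
    how narrow each mode's band is. Returns a (K, N) array of time-domain modes.
    """
    N = len(signal)
    f_hat = np.fft.fftshift(np.fft.fft(signal))
    freqs = np.arange(N) / N - 0.5              # shifted, normalized frequency axis
    u_hat = np.zeros((K, N), dtype=complex)     # mode spectra
    omega = np.linspace(0.05, 0.45, K)          # illustrative initial center frequencies
    pos = freqs > 0
    for _ in range(n_iter):
        u_prev = u_hat.copy()
        for k in range(K):
            # Wiener-filter update: mode k claims the spectral residual
            # around its current center frequency omega[k].
            residual = f_hat - u_hat.sum(axis=0) + u_hat[k]
            u_hat[k] = residual / (1 + 2 * alpha * (np.abs(freqs) - omega[k]) ** 2)
            # Re-center omega[k] at the mode's spectral center of mass.
            power = np.abs(u_hat[k, pos]) ** 2
            if power.sum() > 0:
                omega[k] = (freqs[pos] * power).sum() / power.sum()
        change = np.sum(np.abs(u_hat - u_prev) ** 2)
        if change / max(np.sum(np.abs(u_prev) ** 2), 1e-12) < tol:
            break
    return np.fft.ifft(np.fft.ifftshift(u_hat, axes=1), axis=1).real

def vmd_features(signal, K=4):
    """One plausible feature vector for a classifier: per-mode log energies."""
    modes = vmd(signal, K=K)
    return np.log(np.sum(modes ** 2, axis=1) + 1e-12)

# Illustrative usage on a synthetic two-tone signal (stand-in for an EEG channel).
n = np.arange(500)
x = np.sin(2 * np.pi * 0.05 * n) + np.sin(2 * np.pi * 0.20 * n)
modes = vmd(x, K=2)          # each mode isolates one tone
features = vmd_features(x, K=2)
```

In the paper's pipeline, features of this kind (computed per EEG channel) would then be fed to a Deep Neural Network classifier; the log-energy features above are one simple choice, not the paper's exact feature set.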
ISSN: 1319-1578, 2213-1248
DOI: 10.1016/j.jksuci.2019.11.003