GANSER: A Self-Supervised Data Augmentation Framework for EEG-Based Emotion Recognition
Published in: IEEE Transactions on Affective Computing, 2023-07, Vol. 14 (3), pp. 2048-2063
Main Authors: , ,
Format: Article
Language: English
Summary: Electroencephalography (EEG)-based affective computing suffers from data scarcity. As a result, it is difficult to build effective, highly accurate, and stable models using machine learning algorithms, especially deep learning models. Data augmentation has recently been shown to improve deep learning models, increasing accuracy and stability and reducing overfitting. In this paper, we propose a novel data augmentation framework, named generative adversarial network-based self-supervised data augmentation (GANSER). As the first framework to combine adversarial training with self-supervised learning for EEG-based emotion recognition, it generates high-quality, high-diversity simulated EEG samples. In particular, we utilize adversarial training to learn an EEG generator and force the generated EEG signals to approximate the distribution of real samples, ensuring the quality of the augmented samples. A transformation operation masks parts of the EEG signals and forces the generator to synthesize plausible EEG signals from the unmasked parts, producing a wide variety of samples. The masking probability used during transformation is introduced as prior knowledge to generalize the classifier to the augmented sample space. Finally, extensive experiments demonstrate that the proposed method improves emotion recognition performance and achieves state-of-the-art results.
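The masking transformation described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the segment length, function name, and segment-wise masking scheme are assumptions, not the paper's actual implementation (in GANSER the masked signal is fed to the GAN generator, which synthesizes the missing parts).

```python
import numpy as np

def mask_eeg(signal, mask_prob=0.5, rng=None):
    """Randomly zero out fixed-length time segments of an EEG sample.

    signal:    array of shape (channels, timesteps).
    mask_prob: probability that any given time segment is masked
               (the masking probability the paper treats as prior knowledge).
    rng:       optional numpy Generator for reproducibility.

    Note: segment length and zero-filling are illustrative choices,
    not taken from the paper.
    """
    rng = rng or np.random.default_rng()
    channels, timesteps = signal.shape
    segment = max(1, timesteps // 8)  # illustrative segment length
    masked = signal.copy()
    for start in range(0, timesteps, segment):
        if rng.random() < mask_prob:
            masked[:, start:start + segment] = 0.0
    return masked

# Example: a toy 32-channel, 128-sample EEG window
x = np.random.randn(32, 128)
x_masked = mask_eeg(x, mask_prob=0.5)
```

In the framework as described, such a masked sample would be passed to the learned generator, which reconstructs the hidden portions, yielding an augmented sample that differs from the original while remaining close to the real-data distribution.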
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2022.3170369