Contrastive Self-supervised EEG Representation Learning for Emotion Classification

Bibliographic Details
Main Authors: Hu, Keya, Dai, Ren-Jie, Chen, Wen-Tao, Yin, Hao-Long, Lu, Bao-Liang, Zheng, Wei-Long
Format: Conference Proceeding
Language: English
Summary: Self-supervised learning provides an effective way to leverage large amounts of unlabeled data. Numerous previous studies have indicated that applying self-supervision to physiological signals can yield better representations of those signals. In this paper, we apply this approach to the crucial field of emotion recognition. We conduct experiments with several state-of-the-art contrastive self-supervised methods to explore their effectiveness in pre-training feature encoders on raw electroencephalography (EEG) signals and fine-tuning the pre-trained encoders on downstream emotion classification tasks. We vary the proportion of labeled data used during fine-tuning and find that the improvement from self-supervised methods is more pronounced when the proportion of labeled data is small. Additionally, we explore the transferability of feature encoders pre-trained on various datasets and observe that most self-supervised methods exhibit a certain degree of transferability. Methods that effectively exploit the temporal information in EEG signals show superior stability, accuracy, and transferability.
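
The abstract does not give implementation details, so the following is only a minimal illustrative sketch of the kind of contrastive pre-training step it describes, assuming a SimCLR-style NT-Xent objective on two augmented views of raw EEG windows. The encoder architecture, Gaussian-jitter augmentation, 62-channel/800-sample window shape, and all hyperparameters are assumptions for illustration, not the authors' setup.

# Hypothetical sketch: SimCLR-style NT-Xent contrastive pre-training on raw EEG.
# Everything below (encoder, augmentation, sizes, learning rate) is an assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Small 1-D CNN mapping raw EEG (channels x time) to an embedding."""
    def __init__(self, n_channels=62, emb_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.proj = nn.Linear(128, emb_dim)

    def forward(self, x):                      # x: (batch, channels, time)
        h = self.net(x).squeeze(-1)            # (batch, 128)
        return self.proj(h)                    # (batch, emb_dim)

def nt_xent(z1, z2, temperature=0.1):
    """NT-Xent loss over two augmented views of the same batch."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, d)
    sim = z @ z.t() / temperature                             # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))                # exclude self-pairs
    # Positive of row i is row i+N (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# One pre-training step on unlabeled EEG: two stochastic augmentations of each
# window (simple jitter here; real methods use cropping, masking, channel dropout).
encoder = EEGEncoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
eeg = torch.randn(32, 62, 800)                # 32 windows, 62 channels, 800 samples
view1 = eeg + 0.05 * torch.randn_like(eeg)
view2 = eeg + 0.05 * torch.randn_like(eeg)
optimizer.zero_grad()
loss = nt_xent(encoder(view1), encoder(view2))
loss.backward()
optimizer.step()

For the downstream stage described in the abstract, the pre-trained encoder would be reused with a small classification head and fine-tuned on whatever fraction of labeled emotion data is available.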
ISSN: 2694-0604
DOI: 10.1109/EMBC53108.2024.10781579