Geometry-Constrained EEG Channel Selection for Brain-Assisted Speech Enhancement
| Published in: | arXiv.org 2024-09 |
|---|---|
| Format: | Article |
| Language: | English |
| Summary: | Brain-assisted speech enhancement (BASE) aims to extract the target speaker in complex multi-talker scenarios using electroencephalogram (EEG) signals as an assistive modality, as the auditory attention of the listener can be decoded from the electroneurographic signals of the brain. This enables a potential integration of EEG electrodes with listening devices to improve speech intelligibility for hearing-impaired listeners, as shown by the recently proposed BASEN model. Since multichannel EEG signals are in general highly correlated and some channels are even irrelevant to listening, blindly incorporating all EEG channels would incur a high economic and computational cost. In this work, we therefore propose a geometry-constrained EEG channel selection approach for BASE. We design a new weighted multi-dilation temporal convolutional network (WD-TCN) as the backbone to replace the Conv-TasNet in BASEN. Given a raw channel set defined by the electrode geometry for feasible integration, we then propose a geometry-constrained convolutional regularization selection (GC-ConvRS) module for the WD-TCN to find an informative EEG subset. Experimental results on a public dataset show the superiority of the proposed WD-TCN over BASEN. The GC-ConvRS further refines the useful EEG subset subject to the geometry constraint, resulting in a better trade-off between performance and integration cost. |
| ISSN: | 2331-8422 |
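The summary describes selecting an informative EEG subset by attaching a convolutional regularization module, constrained to a geometry-feasible raw channel set, to the enhancement backbone. The record gives no implementation details, so the PyTorch sketch below is only a hedged illustration of that general idea, not the paper's GC-ConvRS module: learnable per-channel gains (a 1x1 convolution realised as a diagonal gain), an L1 penalty that drives uninformative channels toward zero, and a binary mask encoding the geometry constraint. The names `ChannelSelect`, `feasible_mask`, and `sparsity_loss` are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: an assumed form of geometry-constrained,
# regularization-based EEG channel selection, not the authors' GC-ConvRS code.
import torch
import torch.nn as nn


class ChannelSelect(nn.Module):
    """Select informative EEG channels via learnable per-channel gains.

    Channels outside the geometry-feasible set (mask == 0) are excluded a
    priori; the remaining gains are shrunk by an L1 penalty so that only an
    informative subset stays active after training.
    """

    def __init__(self, num_channels: int, feasible_mask: torch.Tensor):
        super().__init__()
        # Learnable diagonal gain per EEG channel (equivalent to a 1x1 conv).
        self.gain = nn.Parameter(torch.ones(num_channels))
        # Binary mask from the electrode geometry (1 = allowed, 0 = excluded).
        self.register_buffer("mask", feasible_mask.float())

    def forward(self, eeg: torch.Tensor) -> torch.Tensor:
        # eeg: (batch, channels, time)
        w = self.gain * self.mask
        return eeg * w.unsqueeze(0).unsqueeze(-1)

    def sparsity_loss(self) -> torch.Tensor:
        # L1 regularization term added to the training loss; it pushes the
        # gains of uninformative (but geometry-feasible) channels toward zero.
        return (self.gain * self.mask).abs().sum()


if __name__ == "__main__":
    torch.manual_seed(0)
    mask = torch.ones(64)         # e.g. a 64-channel cap
    mask[32:] = 0                 # hypothetical constraint: only the first 32 sites
    selector = ChannelSelect(64, mask)
    x = torch.randn(2, 64, 1000)  # (batch, EEG channels, time samples)
    y = selector(x)
    print(y.shape, selector.sparsity_loss().item())
```

In such a scheme the channel subset is read off after training by thresholding the masked gains; the trade-off between enhancement performance and integration cost mentioned in the summary would then be steered by the weight placed on the sparsity term.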