Classifying EEG Signals during Stereoscopic Visualization to Estimate Visual Comfort
Published in: Computational Intelligence and Neuroscience, Vol. 2016 (2016), Article ID 2758103
Format: Article
Language: English
Summary: With stereoscopic displays, a depth sensation that is too strong can impair visual comfort and may result in fatigue or pain. We used electroencephalography (EEG) to develop a novel brain-computer interface that monitors users' states in order to reduce visual strain. We present the first system that uses EEG to discriminate comfortable from uncomfortable viewing conditions during stereoscopic vision. In particular, we show that either changes in event-related potential (ERP) amplitudes or changes in the power of EEG oscillations following the presentation of stereoscopic objects can be used to estimate visual comfort. Our system reacts to depth variations within 1 s, achieving 63% accuracy on average (up to 76%), and 74% on average when 7 consecutive variations are measured (up to 93%). Performance remains stable (≈62.5%) when simplified signal processing is used to simulate online analyses or when the number of EEG channels is reduced. This study could lead to adaptive systems that automatically tailor stereoscopic displays to users and viewing conditions, for example by modifying the overlap of the left and right images according to the classifier output so that the stereoscopic effect matches the user's state.
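The kind of pipeline the summary describes (features extracted from short EEG epochs following depth variations, fed to a classifier that labels each epoch comfortable or uncomfortable) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' exact method: the sampling rate, frequency bands, channel count, band-power features, and LDA classifier are all assumptions for the sketch, and the data are synthetic stand-ins.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 256  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 7), "alpha": (8, 12), "beta": (13, 30)}  # illustrative bands

def band_power_features(epochs, fs=FS):
    """Log band power per channel and frequency band.

    epochs: array of shape (n_epochs, n_channels, n_samples),
    one 1 s epoch per stereoscopic depth variation.
    """
    n_epochs, n_channels, _ = epochs.shape
    feats = np.empty((n_epochs, n_channels * len(BANDS)))
    for i, epoch in enumerate(epochs):
        # Welch PSD estimated independently for each channel
        freqs, psd = welch(epoch, fs=fs, nperseg=fs)
        cols = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs <= hi)
            cols.append(np.log(psd[:, mask].mean(axis=1)))  # mean power in band
        feats[i] = np.concatenate(cols)
    return feats

# Synthetic stand-in data: 100 one-second epochs, 14 channels, binary comfort labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, 14, FS))
labels = rng.integers(0, 2, size=100)  # 0 = comfortable, 1 = uncomfortable

X = band_power_features(epochs)
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, labels, cv=5).mean())  # chance-level on random data
```

Aggregating per-epoch predictions over several consecutive depth variations (for example, a majority vote over 7 epochs) would mirror the accumulation strategy reported in the summary, which raised average accuracy from 63% to 74%.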
ISSN: 1687-5265, 1687-5273
DOI: 10.1155/2016/2758103