
A Hybrid Brain-Computer Interface Based on Visual Evoked Potential and Pupillary Response

Bibliographic Details
Published in: Frontiers in human neuroscience, 2022-02, Vol. 16, p. 834959
Main Authors: Jiang, Lu, Li, Xiaoyang, Pei, Weihua, Gao, Xiaorong, Wang, Yijun
Format: Article
Language: English
Description: Brain-computer interfaces (BCIs) based on the steady-state visual evoked potential (SSVEP) have been widely studied due to their high information transfer rate (ITR), minimal user training, and wide subject applicability. However, they also have disadvantages, such as visual discomfort and "BCI illiteracy." To address these problems, this study proposes using low-frequency stimulations (12 classes, 0.8-2.12 Hz with an interval of 0.12 Hz), which can simultaneously elicit a visual evoked potential (VEP) and a pupillary response (PR), to construct a hybrid BCI (h-BCI) system. Classification accuracy was calculated using supervised and unsupervised methods, and the hybrid accuracy was obtained using a decision-fusion method that combines the VEP and PR information. Online experimental results from 10 subjects showed average accuracies of 94.90 ± 2.34% (data length 1.5 s) for the supervised method and 91.88 ± 3.68% (data length 4 s) for the unsupervised method, corresponding to ITRs of 64.35 ± 3.07 bits/min (bpm) and 33.19 ± 2.38 bpm, respectively. Notably, the hybrid method achieved higher accuracy and ITR than VEP or PR alone for most subjects, especially at short data lengths. Together with the subjects' feedback on user experience, these results indicate that the proposed h-BCI with the low-frequency stimulation paradigm is more comfortable and favorable than the traditional SSVEP-BCI paradigm using the alpha frequency range.
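The ITR figures quoted in the abstract can be sanity-checked with Wolpaw's standard ITR formula. The sketch below is an illustration, not the authors' code; the selection time passed in (2.9 s, i.e. the 1.5 s data length plus an assumed gaze-shift interval) is a hypothetical value chosen only to show that the reported magnitude is plausible.

```python
import math

def wolpaw_itr_bpm(n_classes: int, accuracy: float, selection_time_s: float) -> float:
    """Information transfer rate in bits/min, per Wolpaw's formula:
    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)).
    """
    p = accuracy
    bits = math.log2(n_classes)
    if 0.0 < p < 1.0:  # the correction terms vanish at P=1 and are undefined at P=0 or 1
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return bits * 60.0 / selection_time_s

# 12 classes at 94.90% accuracy, with a hypothetical 2.9 s per selection:
print(round(wolpaw_itr_bpm(12, 0.9490, 2.9), 1))  # → 64.5, near the reported 64.35 bpm
```

With these assumed timings the formula lands close to the reported supervised-method ITR; the exact figure depends on the per-selection timing used in the study.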
DOI: 10.3389/fnhum.2022.834959
Published: 2022-02-03, Frontiers Research Foundation, Switzerland
PMID: 35185500
Rights: Copyright © 2022 Jiang, Li, Pei, Gao and Wang. Licensed under CC BY 4.0 (http://creativecommons.org/licenses/by/4.0/).
ISSN: 1662-5161
EISSN: 1662-5161
Source: PubMed Central
Subjects:
Accuracy
BCI illiteracy
Brain
Brain research
Classification
Communication
Computer applications
Correlation analysis
electroencephalogram
Electroencephalography
Experiments
Human-computer interaction
hybrid brain-computer interface
Implants
Neuroscience
pupillary response
task-related component analysis
User experience
visual evoked potential
Visual evoked potentials