Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction
Published in: | Psychology in Russia: State of the Art, 2017-01, Vol. 10 (3), p. 120-137
---|---
Main Authors: | Shishkin, Sergei L.; Zhao, Darisii G.; Isachenko, Andrei V.; Velichkovsky, Boris M.
Format: | Article
Language: | English
Subjects: | attention; brain output pathways; brain-computer interface (BCI); electroencephalography (EEG); expectancy wave (E-wave); eye movements; eye-brain-computer interface (EBCI); eye-to-eye contact; human-robot interaction; robots
container_end_page | 137 |
container_issue | 3 |
container_start_page | 120 |
container_title | Psychology in Russia : state of the art |
container_volume | 10 |
creator | Shishkin, Sergei L.; Zhao, Darisii G.; Isachenko, Andrei V.; Velichkovsky, Boris M.
description | Background. Human-machine interaction technology has greatly evolved during the last decades, but manual and speech modalities remain single output channels with their typical constraints imposed by the motor system’s information transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands or even intentions to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research. Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction. Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction, and approaches to building a more efficient technology. Specifically, “communicative” patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual “eye-to-eye” exchange of looks between human and robot. Further, we provide an example of “eye mouse” superiority over the computer mouse, here in emulating the task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones. Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of the gaze on selection of spatial locations. The new approaches discussed show a high potential for creating alternative output pathways for the human brain. When support from passive BCIs becomes mature, the hybrid technology of the eye-brain-computer (EBCI) interface will have a chance to enable natural, fluent, and effortless interaction with machines in various fields of application. |
doi_str_mv | 10.11621/pir.2017.0308 |
format | article |
identifier | ISSN: 2074-6857 |
ispartof | Psychology in Russia : state of the art, 2017-01, Vol.10 (3), p.120-137 |
issn | 2074-6857; 2307-2202 (EISSN)
language | eng |
source | Publicly Available Content Database; IngentaConnect Journals |
subjects | attention; brain output pathways; brain-computer interface (BCI); electroencephalography (EEG); expectancy wave (E-wave); eye movements; eye-brain-computer interface (EBCI); eye-to-eye contact; human-robot interaction; Robots
title | Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction |
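As an illustration of the approach summarized in the abstract, the sketch below shows how a gaze-dwell selection could be gated by a passive EEG "intention" filter, in the spirit of the eye-brain-computer interface (EBCI) idea. It is not taken from the paper: the dwell threshold, fixation radius, E-wave feature, and the `is_intentional` decision rule are hypothetical placeholders, and gaze and EEG streams are assumed to be sample-aligned.

```python
# Minimal sketch (assumptions noted above): dwell-based gaze selection,
# confirmed by a toy expectancy-wave (E-wave) check on the EEG recorded
# during the dwell. Thresholds and features are illustrative only.

import numpy as np

DWELL_MS = 500            # assumed dwell duration required for a candidate selection
FIXATION_RADIUS_PX = 40   # assumed spatial tolerance for "staying on target"

def detect_dwell(gaze_xy, timestamps_ms, target_xy):
    """Return (start_idx, end_idx) of the first gaze dwell on the target, or None."""
    start = None
    for i, (p, t) in enumerate(zip(gaze_xy, timestamps_ms)):
        on_target = np.linalg.norm(np.asarray(p) - np.asarray(target_xy)) <= FIXATION_RADIUS_PX
        if on_target and start is None:
            start = i                      # dwell begins
        elif not on_target:
            start = None                   # gaze left the target; reset
        if start is not None and t - timestamps_ms[start] >= DWELL_MS:
            return start, i
    return None

def ewave_feature(eeg_window):
    """Toy stand-in for an E-wave feature: mean amplitude of one
    fronto-central channel over the dwell (array is channels x samples)."""
    return float(eeg_window[0].mean())

def is_intentional(eeg_window, threshold=-2.0e-6):
    """Hypothetical rule: a sufficiently negative slow potential is taken
    as evidence that the dwell was intentional rather than spontaneous."""
    return ewave_feature(eeg_window) < threshold

def select_if_intended(gaze_xy, timestamps_ms, eeg, target_xy):
    """Trigger a selection only if a dwell occurs AND the EEG filter agrees."""
    dwell = detect_dwell(gaze_xy, timestamps_ms, target_xy)
    if dwell is None:
        return False
    i0, i1 = dwell
    return is_intentional(eeg[:, i0:i1 + 1])
```

In practice, as the abstract indicates, such a filter would not use a fixed threshold: it would be a per-user classifier trained on EEG correlates of expectation recorded during intentional versus spontaneous gaze dwells, with the dwell detector and EEG acquisition synchronized by the eye tracker's timestamps.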