A Data-Driven Approach for Gaze Tracking
Main Authors: Huang, Kevin; Khalil, Mahmoud; Luciani, Evelyn; Melesse, Daniel; Ning, Taikang
Format: Conference Proceeding
Language: English
Subjects: automatic gaze tracking; Cameras; classification; Cornea; Gaze tracking; human computer interface; k-means clustering; Lenses; Light emitting diodes; Lighting; Monitoring
Field | Value |
---|---|
cited_by | |
cites | |
container_end_page | 499 |
container_issue | |
container_start_page | 494 |
container_title | |
container_volume | |
creator | Huang, Kevin; Khalil, Mahmoud; Luciani, Evelyn; Melesse, Daniel; Ning, Taikang |
description | Gaze tracking presents an intuitive interface for technology in today's society, with particular application to controlling electronic devices. This paper concentrates on the design and application of an automatic gaze tracking system utilizing commodity equipment. Compared to preceding low-cost methods, the proposed method is significantly simpler, lowering the barrier of entry for this type of device, and can potentially afford more accurate tracking. Through the careful placement of infrared (IR) light-emitting diodes (LEDs) on the monitor and coaxially with the optical axis of the camera, the pupil was illuminated and reference glints became visible on the cornea. These glints were captured by a camera capable of detecting IR light and were used to determine the user's line of sight relative to the monitor. A linear model was used to map the horizontal and vertical components of the glints in the user's eye to the corresponding location on the monitor. K-means clustering was utilized to classify the separate gaze regions with promising results. (Illustrative code sketches of the linear glint-to-screen mapping and the k-means region step follow the record fields below.) |
doi_str_mv | 10.1109/ICSP.2018.8652292 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | EISSN: 2164-5221; EISBN: 9781538646717, 9781538646731, 1538646730, 1538646714 |
ispartof | 2018 14th IEEE International Conference on Signal Processing (ICSP), 2018, p.494-499 |
issn | 2164-5221 |
language | eng |
recordid | cdi_ieee_primary_8652292 |
source | IEEE Xplore All Conference Series |
subjects | automatic gaze tracking; Cameras; classification; Cornea; Gaze tracking; human computer interface; k-means clustering; Lenses; Light emitting diodes; Lighting; Monitoring |
title | A Data-Driven Approach for Gaze Tracking |
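The description mentions a linear model that maps the horizontal and vertical glint coordinates extracted from the IR eye image to a point on the monitor. The paper's exact formulation is not reproduced in this record; the following is a minimal sketch, assuming a per-axis affine fit estimated by least squares from a handful of calibration fixations. The function names, the `(N, 2)` data layout, and the bias term are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_linear_gaze_model(glints, targets):
    """Least-squares fit of an affine map from glint coordinates (pixels in
    the IR eye image) to monitor coordinates.

    glints  : (N, 2) array of [gx, gy] glint positions from calibration frames
    targets : (N, 2) array of [sx, sy] known monitor points fixated by the user
    Returns a (3, 2) matrix W such that [gx, gy, 1] @ W approximates [sx, sy].
    """
    glints = np.asarray(glints, dtype=float)
    targets = np.asarray(targets, dtype=float)
    G = np.hstack([glints, np.ones((len(glints), 1))])  # append bias column
    W, *_ = np.linalg.lstsq(G, targets, rcond=None)     # minimize ||G W - targets||
    return W

def predict_screen_point(W, glint):
    """Map one glint position to an estimated monitor coordinate."""
    gx, gy = glint
    return np.array([gx, gy, 1.0]) @ W
```

With a small grid of calibration targets (e.g., a 3×3 pattern of on-screen points), the fit needs only the detected glint centre recorded for each fixation.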
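The record also lists k-means clustering as the method used to classify separate gaze regions. The number of regions and the exact feature representation are not given here, so the snippet below is only a sketch: it groups estimated on-screen gaze points into a fixed, assumed number of regions using scikit-learn's `KMeans`. The parameter `n_regions` and the function name are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_gaze_regions(screen_points, n_regions=4, seed=0):
    """Cluster estimated on-screen gaze points into discrete regions.

    screen_points : (N, 2) array of predicted [sx, sy] monitor coordinates
    n_regions     : assumed number of gaze regions on the monitor
    Returns (labels, centers): a region label per sample and the region centres.
    """
    pts = np.asarray(screen_points, dtype=float)
    km = KMeans(n_clusters=n_regions, n_init=10, random_state=seed)
    labels = km.fit_predict(pts)          # region index for every gaze sample
    return labels, km.cluster_centers_    # cluster centres in monitor coordinates
```

At run time, assigning a new gaze estimate to the nearest entry of the returned centres gives its region label.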