Augmenting Dementia Cognitive Assessment With Instruction-Less Eye-Tracking Tests

Bibliographic Details
Published in: IEEE journal of biomedical and health informatics, 2020-11, Vol. 24 (11), p. 3066-3075
Main Authors: Mengoudi, Kyriaki; Ravi, Daniele; Yong, Keir X. X.; Primativo, Silvia; Pavisic, Ivanna M.; Brotherhood, Emilie; Lu, Kirsty; Schott, Jonathan M.; Crutch, Sebastian J.; Alexander, Daniel C.
Format: Article
Language: English
Subjects: Artificial neural networks; Biomarkers; Biomedical materials; Cognition; Cognitive ability; Deep learning; Dementia; Dementia disorders; Eye movements; Eye-tracking; Feature extraction; Learning; Machine learning; Neural networks; Pupils; Representation learning; Representations; Semantics; Task analysis; Tracking
DOI: 10.1109/JBHI.2020.3004686
ISSN: 2168-2194
EISSN: 2168-2208
PMID: 32749977
Publisher: IEEE (United States)

Full description
Eye-tracking technology is an innovative tool that holds promise for enhancing dementia screening. In this work, we introduce a novel way of extracting salient features directly from the raw eye-tracking data of a mixed sample of dementia patients during a novel instruction-less cognitive test. Our approach is based on self-supervised representation learning where, by training initially a deep neural network to solve a pretext task using well-defined available labels (e.g. recognising distinct cognitive activities in healthy individuals), the network encodes high-level semantic information which is useful for solving other problems of interest (e.g. dementia classification). Inspired by previous work in explainable AI, we use the Layer-wise Relevance Propagation (LRP) technique to describe our network's decisions in differentiating between the distinct cognitive activities. The extent to which eye-tracking features of dementia patients deviate from healthy behaviour is then explored, followed by a comparison between self-supervised and handcrafted representations on discriminating between participants with and without dementia. Our findings not only reveal novel self-supervised learning features that are more sensitive than handcrafted features in detecting performance differences between participants with and without dementia across a variety of tasks, but also validate that instruction-less eye-tracking tests can detect oculomotor biomarkers of dementia-related cognitive dysfunction. This work highlights the contribution of self-supervised representation learning techniques in biomedical applications where the small number of patients, the non-homogenous presentations of the disease and the complexity of the setting can be a challenge using state-of-the-art feature extraction methods.
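
As an illustration of the pretext-task idea summarised above, the sketch below (Python, assuming PyTorch; the class and function names, window length, channel layout, network architecture and number of activity classes are invented for the example and are not taken from the paper) trains a small 1-D convolutional encoder to recognise which cognitive activity produced a window of raw gaze samples, then reuses the frozen embedding as a feature vector that a downstream dementia-versus-control classifier could consume. It is a minimal sketch of self-supervised representation learning in general, not the authors' implementation.

import torch
import torch.nn as nn

class GazeEncoder(nn.Module):
    # 1-D convolutional encoder mapping a window of gaze samples to an embedding,
    # with a pretext head that predicts the cognitive activity that produced it.
    def __init__(self, in_channels=3, embed_dim=64, n_activities=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),               # global average pooling over time
        )
        self.embed = nn.Linear(64, embed_dim)
        self.pretext_head = nn.Linear(embed_dim, n_activities)

    def forward(self, x):                          # x: (batch, channels, time)
        z = self.embed(self.features(x).squeeze(-1))
        return z, self.pretext_head(z)

encoder = GazeEncoder()
optimiser = torch.optim.Adam(encoder.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder pretext data: 128 windows of 500 samples (x, y, pupil size) recorded
# from healthy participants, each labelled with the activity it was recorded during.
gaze_windows = torch.randn(128, 3, 500)
activity_labels = torch.randint(0, 4, (128,))

for _ in range(5):                                 # a few epochs, for illustration only
    optimiser.zero_grad()
    _, logits = encoder(gaze_windows)
    loss_fn(logits, activity_labels).backward()
    optimiser.step()

# Downstream use: freeze the encoder and treat its embeddings as learned features
# for discriminating participants with and without dementia.
with torch.no_grad():
    embeddings, _ = encoder(gaze_windows)          # shape (128, 64)

In practice the embeddings would be fed, alongside or instead of handcrafted oculomotor features (fixation counts, saccade statistics and the like), to a simple classifier. For the explanation step the abstract mentions Layer-wise Relevance Propagation; the small NumPy function below shows the standard LRP-epsilon rule for a single dense layer (a generic sketch of the rule, not the paper's implementation), which redistributes an output relevance vector back onto that layer's inputs.

import numpy as np

def lrp_epsilon_dense(W, b, a, R_out, eps=1e-6):
    # W: (n_in, n_out) weights, b: (n_out,) biases, a: (n_in,) input activations,
    # R_out: (n_out,) relevance assigned to the layer's outputs.
    z = a @ W + b                                          # pre-activations
    s = R_out / (z + eps * np.where(z >= 0, 1.0, -1.0))    # stabilised ratios
    return a * (W @ s)                                     # relevance on the inputs

Applying such a rule layer by layer, from the activity logits back to the input gaze channels, yields a relevance map indicating which parts of the eye-tracking signal drove the network's decision.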