
A novel machine learning analysis of eye-tracking data reveals suboptimal visual information extraction from facial stimuli in individuals with autism

Bibliographic Details
Published in: Neuropsychologia 2019-06, Vol. 129, p. 397-406
Main Authors: Król, Magdalena Ewa, Król, Michał
Format: Article
Language:English
Subjects:
description We propose a new method of quantifying the utility of the visual information extracted from facial stimuli for emotion recognition. The stimuli are convolved with a Gaussian fixation-distribution estimate, revealing more information in the facial regions the participant fixated on. Feeding this convolution to a machine-learning emotion recognition algorithm yields an error measure (between actual and predicted emotions) reflecting the quality of the extracted information. We recorded the eye movements of 21 participants with autism and 23 age-, sex- and IQ-matched typically developing participants performing three facial-analysis tasks: free viewing, emotion recognition, and brow-mouth width comparison. In the emotion recognition task, fixations of participants with autism were positioned on lower areas of the faces and were less focused on the eyes than those of the typically developing group. Additionally, the utility of the information they extracted in the emotion recognition task was lower. Thus, the emotion recognition deficit typical in autism can be at least partly traced to the earliest stage of face processing, i.e. to the extraction of visual information via eye fixations.
•When recognising emotions, people with autism looked at faces differently than controls.
•Their face-scanning patterns were more variable and less focused on the eyes.
•Using machine learning, we show that this hindered their ability to recognise emotions.
•An emotion recognition algorithm was fed the parts of the face stimuli that participants looked at.
•It was less effective when fed only those parts of the faces that participants with autism fixated on.
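The pipeline the abstract describes can be sketched roughly as follows: fixation points are smoothed into a Gaussian density map, the stimulus is weighted by that map, and the weighted image is what a downstream emotion classifier would see. This is a minimal illustration only, not the authors' code; the function name, coordinate convention, and `sigma` value are all assumptions.

```python
import numpy as np

def fixation_weighted_stimulus(image, fixations, sigma=20.0):
    """Weight an image by a Gaussian fixation-density estimate.

    image     : 2-D array (grayscale face stimulus)
    fixations : iterable of (row, col) fixation coordinates
    sigma     : Gaussian spread in pixels (assumed value)
    """
    rows, cols = np.indices(image.shape)
    density = np.zeros(image.shape)
    # Placing a Gaussian at each fixation is equivalent to convolving the
    # fixation point map with a Gaussian kernel.
    for fr, fc in fixations:
        density += np.exp(-((rows - fr) ** 2 + (cols - fc) ** 2) / (2 * sigma ** 2))
    density /= density.max()   # normalise to [0, 1]
    return image * density     # regions far from any fixation fade toward zero

# Toy example: a flat 100x100 stimulus with one fixation near the "eye" region.
stimulus = np.ones((100, 100))
weighted = fixation_weighted_stimulus(stimulus, [(30, 50)], sigma=10.0)
# An emotion classifier evaluated on `weighted` would yield the prediction
# error that serves as the information-utility measure described above.
```

The key design point is that the classifier never sees the raw stimulus, only the fixation-weighted version, so its error reflects how informative the fixated regions were for the emotion judgement.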
doi_str_mv 10.1016/j.neuropsychologia.2019.04.022
issn 0028-3932
1873-3514
language eng
source ScienceDirect Journals
subjects Autism spectrum disorder
Eye-tracking
Face emotion recognition
Face processing
Machine-learning