
Study on emotion recognition bias in different regional groups

Bibliographic Details
Published in: Scientific reports, 2023-05, Vol. 13 (1), p. 8414, Article 8414
Main Authors: Lukac, Martin, Zhambulova, Gulnaz, Abdiyeva, Kamila, Lewis, Michael
Format: Article
Language:English
Abstract: Human-machine communication can be substantially enhanced by the inclusion of high-quality real-time recognition of spontaneous human emotional expressions. However, successful recognition of such expressions can be negatively impacted by factors such as sudden variations in lighting or intentional obfuscation. Reliable recognition is further impeded by the observation that the presentation and meaning of emotional expressions can vary significantly with the culture of the expressor and the environment in which the emotions are expressed. For example, an emotion recognition model trained on a regionally specific database collected in North America might fail to recognize standard emotional expressions from another region, such as East Asia. To address the problem of regional and cultural bias in emotion recognition from facial expressions, we propose a meta-model that fuses multiple emotional cues and features. The proposed approach integrates image features, action level units, micro-expressions, and macro-expressions into a multi-cues emotion model (MCAM). Each facial attribute incorporated into the model represents a specific category: fine-grained content-independent features, facial muscle movements, short-term facial expressions, and high-level facial expressions. The results of the proposed meta-classifier (MCAM) approach show that (a) the successful classification of regional facial expressions is based on non-sympathetic features; (b) learning the emotional facial expressions of some regional groups can confound the successful recognition of emotional expressions of other regional groups unless learning is done from scratch; and (c) certain facial cues and features of the data sets preclude the design of a perfectly unbiased classifier.
As a result of these observations, we posit that to learn certain regional emotional expressions, other regional expressions first have to be "forgotten".
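The record contains no code, but the fusion idea in the abstract can be sketched in outline. The following is a minimal, illustrative late-fusion sketch, not the authors' MCAM implementation: it assumes four base classifiers (one per cue category named in the abstract) each emit per-class posteriors, which are concatenated into meta-features; the simple averaging step stands in for a trained meta-classifier. All names and shapes here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_emotions = 8, 7  # e.g. 7 basic emotion classes

# Hypothetical per-cue posterior probabilities from four base classifiers,
# one per facial-attribute category mentioned in the abstract:
# image features, action level units, micro-expressions, macro-expressions.
cues = {
    name: rng.dirichlet(np.ones(n_emotions), size=n_samples)
    for name in ("image", "action_units", "micro", "macro")
}

# Late fusion: concatenate the per-cue posteriors into one meta-feature
# vector per sample; a meta-classifier would be trained on this matrix.
meta_features = np.concatenate([cues[k] for k in cues], axis=1)

# Stand-in for the trained meta-classifier: average the cue posteriors
# and pick the most probable emotion for each sample.
fused = np.mean(np.stack(list(cues.values())), axis=0)
predictions = fused.argmax(axis=1)

print(meta_features.shape)  # one fused feature row per sample
print(predictions.shape)    # one predicted emotion index per sample
```

In a stacking setup, the averaging step would be replaced by any classifier fit on `meta_features` against emotion labels, which is presumably where cue weighting across regional groups would be learned.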
DOI: 10.1038/s41598-023-34932-z
Published: 2023-05-24
Publisher: Nature Publishing Group UK
PMID: 37225756
ISSN: 2045-2322
Subjects: Asia, Eastern; Bias; Communication; Cues; Emotions; Humanities and Social Sciences; Humans; Pattern recognition; Science (multidisciplinary)