
An event-related potential comparison of facial expression processing between cartoon and real faces

Bibliographic Details
Published in: PLoS ONE, 2019-01, Vol. 14 (1), p. e0198868
Main Authors: Zhao, Jiayin; Meng, Qi; An, Licong; Wang, Yifang
Format: Article
Language: English
Description: Faces play an important role in human social life. Besides real faces, people also encounter numerous cartoon faces in daily life that convey basic emotional states through facial expressions. Using event-related potentials (ERPs), we conducted a facial expression recognition experiment with 17 university students to compare the processing of cartoon faces with that of real faces. The study used face type (real vs. cartoon), emotion valence (happy vs. angry), and participant gender (male vs. female) as independent variables; reaction time, recognition accuracy, and the amplitudes and latencies of emotion-related ERP components (N170, VPP [vertex positive potential], and LPP [late positive potential]) served as dependent variables. The ERP results revealed that cartoon faces elicited larger N170 and VPP amplitudes and a shorter N170 latency than real faces, whereas real faces induced larger LPP amplitudes than cartoon faces. In addition, the results showed a significant difference across brain regions, reflected in a right-hemisphere advantage. Behaviorally, reaction times were shorter for happy faces than for angry faces, females were more accurate than males, and males recognized angry faces more accurately than happy faces. Given the small sample size, these results suggest, but do not conclusively demonstrate, differences in facial expression recognition and its neural processing between cartoon and real faces: cartoon faces were processed with greater intensity and speed during the early processing stage, whereas more attentional resources were allocated to real faces during the late processing stage.
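The abstract reports amplitude and latency measures for three ERP components (N170, VPP, LPP). As an illustration only, and not the authors' actual analysis pipeline, the sketch below shows one common way such measures are extracted from trial-averaged EEG epochs. The function names, time windows, sampling rate, and simulated data are assumptions for the example, not values taken from the paper.

```python
# Illustrative sketch only: extracting ERP component measures from epoched EEG.
# Time windows, sampling rate, and the simulated data are assumed, generic values,
# not those reported by Zhao et al. (2019).
import numpy as np

def erp_peak(epochs, sfreq, tmin_epoch, window, polarity="neg"):
    """Peak amplitude and latency (in seconds) within `window` of the
    trial-averaged waveform. `epochs` has shape (n_trials, n_times)."""
    evoked = epochs.mean(axis=0)                              # average over trials
    times = tmin_epoch + np.arange(evoked.size) / sfreq       # sample times in seconds
    mask = (times >= window[0]) & (times <= window[1])
    segment = evoked[mask]
    idx = np.argmin(segment) if polarity == "neg" else np.argmax(segment)
    return segment[idx], times[mask][idx]

def mean_amplitude(epochs, sfreq, tmin_epoch, window):
    """Mean amplitude within `window`, a typical measure for slow components like the LPP."""
    evoked = epochs.mean(axis=0)
    times = tmin_epoch + np.arange(evoked.size) / sfreq
    mask = (times >= window[0]) & (times <= window[1])
    return evoked[mask].mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sfreq, tmin = 500.0, -0.2                                  # 500 Hz, epochs from -200 ms
    fake_epochs = rng.normal(size=(60, int(1.2 * sfreq)))      # 60 trials, -0.2 to 1.0 s
    n170_amp, n170_lat = erp_peak(fake_epochs, sfreq, tmin, (0.13, 0.20), polarity="neg")
    vpp_amp, vpp_lat = erp_peak(fake_epochs, sfreq, tmin, (0.14, 0.20), polarity="pos")
    lpp_amp = mean_amplitude(fake_epochs, sfreq, tmin, (0.40, 0.80))
    print(n170_amp, n170_lat, vpp_amp, vpp_lat, lpp_amp)
```

Peak measures (amplitude and latency) suit sharp early components such as the N170 and VPP, while a mean amplitude over a broad window is the more robust choice for a sustained component such as the LPP, which is why the sketch uses both.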
DOI: 10.1371/journal.pone.0198868
Editor: Olino, Thomas M.
Publisher: Public Library of Science (United States)
Publication Date: 2019-01-10
PMID: 30629582
ORCID: https://orcid.org/0000-0002-8404-9906
Rights: © 2019 Zhao et al. Open access under the Creative Commons Attribution License (CC BY 4.0).
ISSN: 1932-6203
EISSN: 1932-6203
Source: Access via ProQuest (Open Access); PubMed Central
Subjects:
Adult
Amplitudes
Analysis
Attention - physiology
Biology and Life Sciences
Brain
Brain research
Cognition & reasoning
Colleges & universities
Communications
Dependent variables
Emotions
Emotions - physiology
Event-related potentials
Evoked Potentials - physiology
Face recognition
Facial Expression
Facial Recognition - physiology
Female
Females
Gender differences
Gene expression
Hemispheric laterality
Humans
Independent variables
Laboratories
Latency
Male
Males
Medicine and Health Sciences
Pattern recognition
Psychology
Reaction time
Reaction Time - physiology
Research and Analysis Methods
Social Sciences