Padova Emotional Dataset of Facial Expressions (PEDFE): A unique dataset of genuine and posed emotional facial expressions
Published in: Behavior research methods, 2023-08, Vol. 55 (5), p. 2559-2574
Main Authors: Miolla, A.; Cardaioli, M.; Scarpazza, C.
Format: Article
Language: English
Subjects: Behavioral Science and Psychology; Cognitive Psychology; Datasets; Emotions; Psychology
Publisher: Springer US (New York)
DOI: 10.3758/s13428-022-01914-4
ISSN: 1554-3528; 1554-351X
EISSN: 1554-3528
PMID: 36002622
Rights: © The Author(s) 2022; open access (CC BY 4.0)
Online Access: Get full text
Abstract:
Facial expressions are among the most powerful signals for human beings to convey their emotional states. Indeed, emotional facial datasets represent the most effective and controlled method of examining humans’ interpretation of and reaction to various emotions. However, scientific research on emotion has mainly relied on static pictures of facial expressions posed (i.e., simulated) by actors, creating a significant bias in the emotion literature. This dataset aims to fill this gap, providing a considerable number (N = 1458) of dynamic genuine (N = 707) and posed (N = 751) clips of the six universal emotions from 56 participants. The dataset is available in two versions: original clips, including participants’ body and background, and modified clips, where only the face of participants is visible. Notably, the original dataset has been validated by 122 human raters, while the modified dataset has been validated by 280 human raters. Hit rates for emotion and genuineness, as well as the mean and standard deviation of genuineness and intensity perception, are provided for each clip to allow future users to select the most appropriate clips needed to answer their scientific questions.