
Acted vs. natural frustration and delight: Many people smile in natural frustration

Bibliographic Details
Main Authors: Hoque, M, Picard, R W
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Pages: 354-359
Description: This work is part of research to build a system to combine facial and prosodic information to recognize commonly occurring user states such as delight and frustration. We create two experimental situations to elicit two emotional states: the first involves recalling situations while expressing either delight or frustration; the second experiment tries to elicit these states directly through a frustrating experience and through a delightful video. We find two significant differences in the nature of the acted vs. natural occurrences of expressions. First, the acted ones are much easier for the computer to recognize. Second, in 90% of the acted cases, participants did not smile when frustrated, whereas in 90% of the natural cases, participants smiled during the frustrating interaction, despite self-reporting significant frustration with the experience. This paper begins to explore the differences in the patterns of smiling that are seen under natural frustration and delight conditions, to see if there might be something measurably different about the smiles in these two cases, which could ultimately improve the performance of classifiers applied to natural expressions.
DOI: 10.1109/FG.2011.5771425
ISBN: 1424491401; 9781424491407
EISBN: 1424491398; 9781424491414; 9781424491391; 142449141X
Published in: Face and Gesture 2011, 2011, p. 354-359
Source: IEEE Electronic Library (IEL) Conference Proceedings
Subjects: Accuracy; Avatars; Cameras; Computers; Face; Feature extraction; machine learning; natural vs. acted data; smile while frustrated; Speech