Toward an affect-sensitive multimodal human-computer interaction
Published in: Proceedings of the IEEE, 2003-09, Vol. 91 (9), pp. 1370-1390
Main Authors: Pantic, M.; Rothkrantz, L.J.M.
Format: Article
Language: English
ISSN: 0018-9219
EISSN: 1558-2256
DOI: 10.1109/JPROC.2003.817122
Publisher: New York: IEEE
Source: IEEE Electronic Library (IEL) Journals
Subjects: Animation; Anthropometry; Cognitive science; Design engineering; Emotion recognition; Emotional intelligence; Emotions; Face detection; Facial; Feedback; Human; Human computer interaction; Intelligence; Interactive; Knowledge based systems; Movements; Neuroscience; Psychology; Recognition; Social interaction; Tasks
Abstract: The ability to recognize affective states of a person we are communicating with is the core of emotional intelligence. Emotional intelligence is a facet of human intelligence that has been argued to be indispensable and perhaps the most important for successful interpersonal social interaction. This paper argues that next-generation human-computer interaction (HCI) designs need to include the essence of emotional intelligence - the ability to recognize a user's affective states - in order to become more human-like, more effective, and more efficient. Affective arousal modulates all nonverbal communicative cues (facial expressions, body movements, and vocal and physiological reactions). In a face-to-face interaction, humans detect and interpret those interactive signals of their communicator with little or no effort. Yet design and development of an automated system that accomplishes these tasks is rather difficult. This paper surveys the past work in solving these problems by a computer and provides a set of recommendations for developing the first part of an intelligent multimodal HCI - an automatic personalized analyzer of a user's nonverbal affective feedback.