Natural Interaction Multimodal Analysis: Expressivity Analysis towards Adaptive and Personalized Interfaces
| Main Authors: | |
|---|---|
| Format: | Conference Proceeding |
| Language: | English |
| Subjects: | |
| Online Access: | Request full text |
| Summary: | Intelligent personalized systems often ignore the affective aspect of human behavior and focus mainly on tactile cues of the user's activity. A complete user model, though, should also incorporate cues such as facial expressions, speech prosody, and gesture or body posture expressivity features in order to dynamically profile the user, fusing all available modalities, since these qualitative affective cues carry significant information about the user's nonverbal behavior and communication. Towards this direction, this work focuses on the automatic extraction of gestural and head expressivity features and the related statistical processing. The perspective of adopting a common formalization of expressivity features across a multitude of visual, emotional modalities is explored and grounded through an overview of experiments on appropriate corpora and the corresponding analysis. |
|---|---|
| DOI: | 10.1109/SMAP.2012.11 |
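
The summary refers to automatic extraction of gestural and head expressivity features but does not give their definitions. The following is a minimal sketch, assuming the expressivity dimensions commonly used in the gesture-analysis literature (overall activation, spatial extent, temporal extent, fluidity, power); the function name, frame rate, and exact formulas are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def expressivity_features(trajectory, fps=25.0):
    """Illustrative gestural expressivity measures from a 2D hand trajectory.

    trajectory: array of shape (T, 2) holding (x, y) positions per frame.
    The definitions below follow commonly used formulations in the
    expressivity literature and are assumptions for demonstration only.
    """
    traj = np.asarray(trajectory, dtype=float)
    dt = 1.0 / fps

    velocity = np.gradient(traj, dt, axis=0)          # frame-wise velocity
    acceleration = np.gradient(velocity, dt, axis=0)  # frame-wise acceleration
    jerk = np.gradient(acceleration, dt, axis=0)      # rate of change of acceleration
    speed = np.linalg.norm(velocity, axis=1)

    return {
        # Overall activation: total amount of motion over the gesture.
        "overall_activation": float(np.sum(speed) * dt),
        # Spatial extent: area of the bounding box swept by the hand.
        "spatial_extent": float(np.prod(traj.max(axis=0) - traj.min(axis=0))),
        # Temporal extent: duration of the gesture in seconds.
        "temporal_extent": float(len(traj) * dt),
        # Fluidity: inverse of mean jerk magnitude (smoother motion scores higher).
        "fluidity": float(1.0 / (np.mean(np.linalg.norm(jerk, axis=1)) + 1e-9)),
        # Power: mean acceleration magnitude, a proxy for movement energy.
        "power": float(np.mean(np.linalg.norm(acceleration, axis=1))),
    }

if __name__ == "__main__":
    # Synthetic example: a smooth circular arc traced over two seconds.
    t = np.linspace(0, 2 * np.pi, 50)
    demo = np.stack([np.cos(t), np.sin(t)], axis=1)
    print(expressivity_features(demo, fps=25.0))
```

In practice such per-gesture scalars would be aggregated statistically (means, distributions per user or per session), which is the kind of statistical processing the summary alludes to.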