Emotion Rendering in Auditory Simulations of Imagined Walking Styles

Bibliographic Details
Published in: IEEE Transactions on Affective Computing, 2017-04, Vol. 8 (2), pp. 241-253
Main Authors: Turchet, Luca; Rodà, Antonio
Format: Article
Language: English
Description
Summary: This paper investigated how different emotional states of a walker can be rendered and recognized by means of footstep sound synthesis algorithms. In a first experiment, participants were asked to render, according to imagined walking scenarios, five emotions (aggressive, happy, neutral, sad, and tender) by manipulating the parameters of synthetic footstep sounds simulating various combinations of surface materials and shoe types. The results identified, for the involved emotions and sound conditions, the mean values and ranges of variation of two parameters: sound level and temporal distance between consecutive steps. These results were in accordance with those reported in previous studies on real walking, suggesting that the expression of emotions in walking is independent of whether the motor activity is real or imagined. In a second experiment, participants were asked to identify the emotions portrayed by walking sounds synthesized by setting the synthesis engine parameters to the mean values found in the first experiment. Results showed that the involved algorithms were successful in conveying the emotional information at a level comparable with previous studies. Both experiments involved musicians and non-musicians, and in both a similar general trend was found between the two groups.
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2016.2520924
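
Illustrative note: the summary above names two synthesis-control parameters, overall sound level and the temporal distance between consecutive steps. The minimal Python sketch below shows how such parameters could drive a simple footstep-sequence renderer. The preset values, function names, and the noise-burst stand-in for a footstep sample are hypothetical placeholders, not the mean values or the synthesis engine reported in the paper.

```python
import numpy as np

# Hypothetical emotion presets: amplitude gain and inter-step interval (seconds).
# These numbers are illustrative placeholders, not the means reported in the paper.
EMOTION_PRESETS = {
    "aggressive": {"gain": 0.9,  "step_interval": 0.45},
    "happy":      {"gain": 0.7,  "step_interval": 0.50},
    "neutral":    {"gain": 0.5,  "step_interval": 0.60},
    "sad":        {"gain": 0.3,  "step_interval": 0.85},
    "tender":     {"gain": 0.25, "step_interval": 0.75},
}

def render_walk(step_sample: np.ndarray, emotion: str, n_steps: int = 8,
                sample_rate: int = 44100) -> np.ndarray:
    """Tile a single footstep sample into a walking sequence whose loudness
    and step timing follow the chosen emotion preset."""
    preset = EMOTION_PRESETS[emotion]
    hop = int(preset["step_interval"] * sample_rate)   # samples between step onsets
    out = np.zeros(hop * n_steps + len(step_sample))
    for i in range(n_steps):
        start = i * hop
        out[start:start + len(step_sample)] += preset["gain"] * step_sample
    return out

# Example: a short windowed noise burst stands in for a synthesized footstep sound.
footstep = np.random.randn(2000) * np.hanning(2000)
walk = render_walk(footstep, "sad")
```

Any mapping from emotion to concrete level and step-interval values would have to be taken from the mean values reported in the paper itself; the sketch only illustrates the two-parameter control structure described in the summary.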