
A Sociable Human-robot Interaction Scheme Based on Body Emotion Analysis


Bibliographic Details
Published in: International Journal of Control, Automation, and Systems, 17(2), pp. 474-485, 2019
Main Authors: Zhu, Tehao; Xia, Zeyang; Dong, Jiaqi; Zhao, Qunfei
Format: Article
Language: English
Description
Summary: Many kinds of interaction schemes for human-robot interaction (HRI) have been reported in recent years. However, most of these schemes rely on recognizing human actions; once the recognition algorithm fails, the robot's reaction cannot proceed further. This issue is overlooked in traditional HRI, but it is the key to further improving the fluency and friendliness of HRI. In this work, a sociable HRI (SoHRI) scheme based on body emotion analysis was developed to achieve reasonable and natural interaction even when human actions are not recognized. First, the emotions conveyed by dynamic movements and static poses were quantified using Laban movement analysis. Second, an interaction strategy built on a finite state machine model was designed to describe the transition rules of the human emotion state. Finally, an appropriate interactive behavior of the robot was selected according to the inferred human emotion state. The quantification effect of SoHRI was verified on the UTD-MHAD dataset, and the whole scheme was evaluated through questionnaires filled out by participants and spectators. The experimental results showed that the SoHRI scheme can analyze body emotion precisely and help the robot choose reasonable interactive behaviors.
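The summary describes the SoHRI pipeline only at a high level: Laban-based quantification of body emotion, a finite state machine over emotion states, and selection of a robot behavior for the inferred state. The minimal Python sketch below illustrates that general structure; the state names, cues, transition rules, and behaviors are hypothetical placeholders, not the ones used in the paper.

```python
# Hypothetical sketch: the paper's actual emotion states, Laban-derived cues,
# and robot behaviors are not listed in this record, so the labels and
# transition table below are illustrative placeholders only.

class EmotionFSM:
    """Toy finite state machine over coarse body-emotion states."""

    # (current state, observed cue) -> next state  [hypothetical rules]
    TRANSITIONS = {
        ("neutral", "high_arousal"): "excited",
        ("neutral", "low_arousal"): "calm",
        ("excited", "low_arousal"): "neutral",
        ("calm", "high_arousal"): "neutral",
    }

    # inferred emotion state -> robot behavior  [hypothetical mapping]
    BEHAVIORS = {
        "neutral": "idle_gesture",
        "excited": "mirror_energy",
        "calm": "gentle_greeting",
    }

    def __init__(self) -> None:
        self.state = "neutral"

    def step(self, cue: str) -> str:
        """Advance the emotion state using a quantified body-emotion cue and
        return the behavior selected for the resulting state."""
        self.state = self.TRANSITIONS.get((self.state, cue), self.state)
        return self.BEHAVIORS[self.state]


fsm = EmotionFSM()
print(fsm.step("high_arousal"))  # neutral -> excited; robot mirrors the energy
```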
ISSN: 1598-6446; 2005-4092
DOI: 10.1007/s12555-017-0423-5