
A gaze control of socially interactive robots in multiple-person interaction

Bibliographic Details
Published in: Robotica 2017-11, Vol. 35 (11), pp. 2122-2138
Main Author: Yun, Sang-Seok
Format: Article
Language:English
Description
Summary: This paper proposes a computational model that enables socially interactive robots to select a suitable interlocutor when interacting with multiple people. To support this, a hybrid approach combining gaze-control criteria with perceptual measurements of social cues is applied to the robot. For the perception part, representative non-verbal behaviors indicating a human's intent to interact are designed based on psychological analyses of human–human interaction, and these behavioral features are quantitatively measured by core perceptual components spanning visual, auditory, and spatial modalities. Recognition performance in each modality is further improved through temporal confidence reasoning as a post-processing step. In addition, two factors, physical space and conversational intimacy, are incorporated into the model calculation to strengthen the robot's social gaze-control effect. Interaction experiments with performance evaluation verify that the proposed model can assess the intended behaviors of individuals and produce appropriate gaze behavior toward multiple people. A success rate of 93.3% against human decision-making criteria confirms the model's potential to establish socially acceptable gaze control in multiple-person interaction.
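The pipeline the abstract describes (per-modality cue scoring, temporal confidence smoothing, and weighting by physical space and conversational intimacy) can be illustrated with a minimal Python sketch. This is not the paper's implementation: the class and function names, the modality weights, the smoothing window, the proximity formula, and the selection threshold below are all illustrative assumptions.

from collections import deque
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Person:
    pid: str
    distance_m: float   # distance from the robot (spatial cue)
    intimacy: float     # assumed conversational-intimacy factor in [0, 1]
    history: deque = field(default_factory=lambda: deque(maxlen=10))

    def observe(self, visual: float, auditory: float, spatial: float) -> None:
        # Hypothetical per-modality cue scores in [0, 1], e.g. face orientation,
        # voice activity, and approaching behavior; the weights are assumptions.
        self.history.append(0.4 * visual + 0.3 * auditory + 0.3 * spatial)

    def confidence(self) -> float:
        # Temporal confidence reasoning, approximated here as a moving average
        # over recent frames to suppress single-frame recognition errors.
        return sum(self.history) / len(self.history) if self.history else 0.0

def select_interlocutor(people, threshold=0.3) -> Optional[Person]:
    # Weight each person's smoothed intention score by physical proximity and
    # intimacy, then gaze at the highest-scoring person above the threshold.
    def score(p: Person) -> float:
        proximity = 1.0 / (1.0 + p.distance_m)   # nearer people weigh more
        return p.confidence() * proximity * (0.5 + 0.5 * p.intimacy)
    best = max(people, key=score, default=None)
    return best if best is not None and score(best) >= threshold else None

# Example: two people; B looks at the robot and speaks while standing closer.
a, b = Person("A", 2.5, 0.2), Person("B", 1.0, 0.6)
a.observe(0.2, 0.1, 0.3)
b.observe(0.9, 0.8, 0.7)
target = select_interlocutor([a, b])   # -> b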
ISSN: 0263-5747, 1469-8668
DOI: 10.1017/S0263574716000722