
Learning Motion Patterns of People for Compliant Robot Motion

Bibliographic Details
Published in: The International Journal of Robotics Research, 2005-01, Vol. 24 (1), pp. 31-48
Main Authors: Bennewitz, Maren, Burgard, Wolfram, Cielniak, Grzegorz, Thrun, Sebastian
Format: Article
Language:English
Description
Summary: Whenever people move through their environments, they do not move randomly. Instead, they usually follow specific trajectories or motion patterns corresponding to their intentions. Knowledge about such patterns enables a mobile robot to robustly keep track of persons in its environment and to improve its behavior. In this paper we propose a technique for learning collections of trajectories that characterize typical motion patterns of persons. Data recorded with laser-range finders are clustered using the expectation-maximization (EM) algorithm. Based on the result of the clustering process, we derive a hidden Markov model that is applied to estimate the current and future positions of persons based on sensory input. We also describe how to incorporate the probabilistic belief about the potential trajectories of persons into the path planning process of a mobile robot. We present several experiments carried out in different environments with a mobile robot equipped with a laser-range scanner and a camera system. The results demonstrate that our approach can reliably learn motion patterns of persons, can robustly estimate and predict positions of persons, and can be used to improve the navigation behavior of a mobile robot.
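The abstract describes propagating a probabilistic belief over a person's position with a hidden Markov model. A minimal sketch of that idea, assuming a toy discretization of one learned motion pattern into a chain of grid states (the transition matrix, likelihood values, and function names here are invented for illustration and are not the authors' implementation):

```python
import numpy as np

# Hypothetical HMM-style position prediction: states are discretized
# positions along one learned motion pattern. The belief is propagated
# with the transition model (prediction) and, when a sensor reading
# arrives, weighted by an observation likelihood (correction).

def predict_belief(belief, transition, steps=1):
    """Propagate the position belief `steps` time steps into the future."""
    for _ in range(steps):
        belief = belief @ transition
    return belief / belief.sum()

def update_belief(belief, likelihood):
    """Weight the predicted belief by the sensor likelihood, renormalize."""
    belief = belief * likelihood
    return belief / belief.sum()

# Toy 4-state chain: the person tends to advance one state per step.
transition = np.array([
    [0.1, 0.9, 0.0, 0.0],
    [0.0, 0.1, 0.9, 0.0],
    [0.0, 0.0, 0.1, 0.9],
    [0.0, 0.0, 0.0, 1.0],
])
belief = np.array([1.0, 0.0, 0.0, 0.0])   # person observed at state 0

# Predict two steps ahead, then fold in an (invented) laser likelihood.
future = predict_belief(belief, transition, steps=2)
likelihood = np.array([0.05, 0.1, 0.8, 0.05])
posterior = update_belief(future, likelihood)
print(future.argmax())  # → 2: most likely position two steps ahead
```

The same predicted belief is what a planner can penalize: paths crossing high-probability future positions of a person get a higher cost.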
ISSN: 0278-3649
1741-3176
DOI: 10.1177/0278364904048962