
Vision-Based Online Adaptation of Motion Primitives to Dynamic Surfaces: Application to an Interactive Robotic Wiping Task

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, 2018-07, Vol. 3 (3), p. 1410-1417
Main Authors: Dometios, Athanasios C., Zhou, You, Papageorgiou, Xanthi S., Tzafestas, Costas S., Asfour, Tamim
Format: Article
Language: English
Description
Summary: Elderly or disabled people usually need augmented nursing attention in both home and clinical environments, especially to perform bathing activities. The development of an assistive robotic bath system, which constitutes a central motivation of this letter, would increase the independence and safety of this procedure, thereby improving everyday life for this group of people. In general terms, the main goal of this letter is to enable natural, physical human-robot interaction, involving human-friendly and user-adaptive online robot motion planning and interaction control. For this purpose, we employ imitation learning within a leader-follower framework called coordinate change dynamic movement primitives (CC-DMP), in order to incorporate the expertise of professional carers for bathing sequences. In this letter, we propose a vision-based washing system, combining the CC-DMP framework with a perception-based controller, to adapt the motion of the robot's end effector to moving and deformable surfaces, such as a human body part. The controller guarantees global uniform asymptotic convergence to the leader movement primitive while ensuring avoidance of restricted areas, such as sensitive skin regions. We experimentally tested our approach on a setup including the humanoid robot ARMAR-III and a Kinect v2 camera. The robot executes motions learned from the publicly available KIT whole-body human motion database, achieving good tracking performance in challenging interactive task scenarios.
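
Background note: the CC-DMP framework named in the summary builds on the standard discrete dynamic movement primitive (DMP) formulation. The following is a minimal sketch of that underlying formulation only, for orientation; the coordinate-change coupling between leader and follower primitives is defined in the full text of the letter, not here. A single DMP drives one end-effector coordinate y toward a goal g through a transformation system paced by a canonical phase variable x:

\tau \dot{z} = \alpha_z \bigl( \beta_z (g - y) - z \bigr) + f(x), \qquad
\tau \dot{y} = z, \qquad
\tau \dot{x} = -\alpha_x x,

f(x) = \frac{\sum_i \psi_i(x)\, w_i}{\sum_i \psi_i(x)} \, x \, (g - y_0), \qquad
\psi_i(x) = \exp\!\bigl( -h_i (x - c_i)^2 \bigr),

where the weights w_i are fitted from demonstrations (in this letter, carer motions taken from the KIT whole-body human motion database) and \psi_i are Gaussian basis functions. The gains \alpha_z, \beta_z, \alpha_x, the time constant \tau, and the basis parameters h_i, c_i are the standard DMP quantities, not notation taken from the letter itself.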
ISSN: 2377-3766
DOI: 10.1109/LRA.2018.2800031