Using spatial constraints for fast set-up of precise pose estimation in an industrial setting

Bibliographic Details
Main Authors: Hagelskjær, Frederik, Savarimuthu, Thiusius Rajeeth, Krüger, Norbert, Buch, Anders Glent
Format: Conference Proceeding
Language: English
Description
Summary: This paper presents a method for high-precision visual pose estimation along with a simple setup procedure. Industrial robotics is a rapidly growing field, and these robots require very precise position information to perform manipulations. This is usually accomplished with fixtures or feeders, both expensive hardware solutions. To enable fast changes in production, more flexible solutions are required, one possibility being visual pose estimation. Although many current pose estimation algorithms show increased recognition rates on public datasets, they do not address the demands of actual applications, either in setup complexity or in accuracy of object localization. In contrast, our method solves a number of specific pose estimation problems in a seamless manner with a simple setup procedure. It relies on a number of workcell constraints and employs a novel method for automatically finding stable object poses. In addition, we use an active rendering method to refine the estimated object poses, yielding very fine localization suitable for robotic manipulation. Experiments comparing current state-of-the-art 2D algorithms with our method show an average reduction in localization uncertainty from 9 mm to 0.95 mm. The method was also used by the winning team at the 2018 World Robot Summit Assembly Challenge.
ISSN: 2161-8089
DOI: 10.1109/COASE.2019.8842876