A general constraint-based programming framework for multi-robot applications

Bibliographic Details
Published in: Robotics and Computer-Integrated Manufacturing, 2024-04, Vol. 86, Article 102665, p. 102665
Main Authors: Fiore, Mario D., Allmendinger, Felix, Natale, Ciro
Format: Article
Language: English
Description
Summary: Classic task programming methods based on the specification of desired Cartesian frames can easily generate overconstrained task specifications, reducing the motion capabilities of the involved robot(s) and increasing the total programming effort. This paper presents a general constraint-based programming framework for the specification of a task as a minimum set of constraints and the automatic generation of motion optimization problems. The framework can handle constraints involving both robot joint and Cartesian coordinates, as well as constraints with explicit time dependency. The proposed formalism naturally scales to robotic applications with multiple robots, on which multiple frames might be of interest. Additionally, the paper provides a theoretical comparison with existing constraint-based programming methods. Finally, the validity and the effectiveness of the proposed approach are numerically supported by illustrative examples, as well as by case studies that mimic real industrial setups.
Highlights:
•Classic industrial task programming relies on tedious trial-and-error procedures.
•Constraint-based programming enables specifying tasks as a minimum set of constraints.
•Constraint-based task descriptions allow exploiting redundant degrees of freedom.
•The proposed formalism avoids the use of feature coordinates.
•The proposed method supports the generation of dynamically-consistent robot motion.
•The proposed framework naturally scales to multi-robot, multi-task applications.
ISSN: 0736-5845, 1879-2537
DOI: 10.1016/j.rcim.2023.102665
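
Illustrative sketch (not taken from the paper described above): to make the idea of constraint-based task specification concrete, the Python snippet below stacks velocity-level constraints A·dq = b — here a single Cartesian constraint on a planar two-joint arm instead of a full frame — and solves a damped least-squares motion optimization problem. All names (planar_2r_jacobian, solve_task), the two-link geometry, and the damped least-squares formulation are assumptions made for illustration only; they are not the formalism proposed in the cited article.

import numpy as np

def planar_2r_jacobian(q, l1=1.0, l2=1.0):
    # Position Jacobian (x, y) of a planar 2R arm; assumed geometry for illustration.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def solve_task(constraints, n_joints, damping=1e-3):
    # Stack velocity-level constraints A @ dq = b and solve the motion
    # optimization problem  min ||A dq - b||^2 + damping * ||dq||^2.
    A = np.vstack([A_i for A_i, _ in constraints])
    b = np.hstack([b_i for _, b_i in constraints])
    H = A.T @ A + damping * np.eye(n_joints)
    return np.linalg.solve(H, A.T @ b)

# Constrain only the x-coordinate of the end effector (one scalar constraint)
# rather than a full Cartesian frame; the remaining degree of freedom stays free.
q = np.array([0.3, 0.5])
J = planar_2r_jacobian(q)
constraints = [(J[0:1, :], np.array([0.1]))]   # desired x-velocity of 0.1 m/s
dq = solve_task(constraints, n_joints=2)
print("joint velocities:", dq)

Because only one Cartesian coordinate is constrained, the unconstrained direction remains available for other objectives, which is the redundancy-exploiting behavior the abstract's highlights refer to.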