Comparison of visual servoing technologies for robotized aerospace structural assembly and inspection

Bibliographic Details
Published in: Robotics and Computer-Integrated Manufacturing, 2022-02, Vol. 73, p. 102237, Article 102237
Main Authors: Santos, Kleber Roberto da Silva, Villani, Emília, de Oliveira, Wesley Rodrigues, Dttman, Augusto
Format: Article
Language: English
Description
Summary:
• Automatic online path generation in robotized aircraft assembly processes.
• Detection of features associated with mechanical bounds over the aircraft structure.
• Absorption of positioning variations in robotized aircraft assembly processes.
• Fusion of a 2D-beam scanner with robot visual servoing in an eye-in-hand configuration.
• Path accuracy validation against ISO 9283 criteria.

This work presents a novel approach for visual servoing of robotized aerospace manufacturing cells, based on the combined use of a camera, a 2D-beam scanner, and a 1D-beam distance sensor attached to the end-effector of a collaborative robot. The proposed system detects features associated with mechanical bounds over the aircraft structure, enabling automatic online trajectory/path generation while the robot performs a target task on an aeronautical part. The effectiveness of the method is demonstrated through experimental evaluations carried out in unstructured environments without illumination or temperature control (simulating real shop-floor conditions), showing that the proposed approach is more robust than the classical computer-vision servoing used for comparison. The system automatically generates and follows a target path with an accuracy of 0.40 mm and a repeatability of 0.59 mm, roughly twice as accurate as the classical approach. The proposed solution is suitable for applications in modern collaborative robotized aerospace assembly cells.
ISSN: 0736-5845, 1879-2537
DOI: 10.1016/j.rcim.2021.102237
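
The abstract reports path accuracy (0.40 mm) and repeatability (0.59 mm) validated against ISO 9283 criteria. As a rough illustration only, and not the authors' code, the sketch below shows one common way ISO 9283-style path accuracy (AT) and path repeatability (RT) can be estimated from a commanded path and repeated measured executions; the function name, data layout, and synthetic example data are assumptions made for this sketch.

    import numpy as np

    def path_accuracy_repeatability(commanded, attained_runs):
        # commanded:     (P, 3) commanded path points [mm]
        # attained_runs: (N, P, 3) attained points over N repeated executions [mm]
        commanded = np.asarray(commanded, dtype=float)
        attained = np.asarray(attained_runs, dtype=float)

        # Barycenter of the attained points at each path position
        barycenters = attained.mean(axis=0)                      # (P, 3)

        # Path accuracy: worst-case offset of a barycenter from its commanded point
        at = np.linalg.norm(barycenters - commanded, axis=1).max()

        # Path repeatability: worst-case spread (mean deviation + 3*std) about the barycenter
        dev = np.linalg.norm(attained - barycenters, axis=2)     # (N, P)
        rt = (dev.mean(axis=0) + 3.0 * dev.std(axis=0, ddof=1)).max()
        return at, rt

    # Synthetic example (hypothetical data): 10 repeated runs over a 50-point straight-line path
    rng = np.random.default_rng(0)
    cmd = np.column_stack([np.linspace(0.0, 100.0, 50), np.zeros(50), np.zeros(50)])
    runs = cmd + rng.normal(scale=0.1, size=(10, 50, 3))
    print("AT = %.3f mm, RT = %.3f mm" % path_accuracy_repeatability(cmd, runs))

In practice the attained points would come from an external measurement system sampling the end-effector along the commanded path; the synthetic Gaussian noise above merely stands in for such measurements.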