Real-time markerless Augmented Reality for Remote Handling system in bad viewing conditions

Bibliographic Details
Published in: Fusion Engineering and Design, October 2011, Vol. 86 (9), pp. 2033–2038
Main Authors: Ziaei, Z., Hahto, A., Mattila, J., Siuko, M., Semeraro, L.
Format: Article
Language: English
Summary: Remote Handling (RH) in harsh environments typically suffers from insufficient visual feedback for the human operator, owing to the limited number of on-site cameras, suboptimal camera placement, poor viewing angles, occlusion, camera failure, and similar constraints. Augmented Reality (AR) enables the user to perceive virtual, computer-generated objects in a real scene; the most common goals are visibility enhancement and the provision of extra information, such as positional data for various objects. The proposed AR system first recognizes and locates the markerless object using a template-based matching algorithm, and then augments the virtual model on top of the recognized item. A tracking algorithm locates the object across a continuous sequence of frames. Conceptually, the template is found by computing the similarity between the template and the image frame for all relevant template poses (rotations and translations). As a case study, the AR interface displayed the measured orientation and translation of the Water Hydraulic Manipulator (WHMAN) divertor preloading tool with near-real-time tracking. "Bad viewing conditions" here refers to cases in which the viewing angle leaves the object's distinguishing features outside the field of view. The method was validated in a concrete operational context at DTP2 (Divertor Test Platform 2) and proved to deliver robust position and orientation information while augmenting and tracking the moving tool.
ISSN: 0920-3796, 1873-7196
DOI: 10.1016/j.fusengdes.2010.12.082
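
Illustrative sketch: the abstract describes recognizing the markerless object by scoring the similarity between a template and the image frame over all relevant template poses (rotation and translation). Below is a minimal, hypothetical Python sketch of that idea, assuming OpenCV's normalized cross-correlation (cv2.matchTemplate) for the translation search and an explicit loop over discretized rotation angles. The article's actual similarity measure, pose sampling, and tracking loop are not reproduced here, and the file names are placeholders.

import cv2
import numpy as np

def rotate_template(template, angle_deg):
    # Rotate the template about its center; the output keeps the original
    # frame size, so large rotations may clip the template's corners.
    h, w = template.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, 1.0)
    return cv2.warpAffine(template, M, (w, h))

def match_over_poses(frame, template, angles_deg):
    # Exhaustive pose search: matchTemplate scores every translation at once,
    # and the explicit loop covers the discretized rotation angles.
    best_score, best_angle, best_top_left = -1.0, None, None
    for angle in angles_deg:
        rotated = rotate_template(template, angle)
        scores = cv2.matchTemplate(frame, rotated, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(scores)
        if max_val > best_score:
            best_score, best_angle, best_top_left = max_val, angle, max_loc
    return best_score, best_angle, best_top_left

if __name__ == "__main__":
    # "frame.png" and "tool_template.png" are placeholder file names.
    frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    template = cv2.imread("tool_template.png", cv2.IMREAD_GRAYSCALE)
    score, angle, top_left = match_over_poses(frame, template,
                                              np.arange(0.0, 360.0, 5.0))
    print("best score %.3f at angle %s deg, top-left %s"
          % (score, angle, top_left))

In a per-frame tracking loop, such a search would typically be restricted to a window of poses around the previous estimate, which is presumably what keeps the per-frame cost compatible with the near-real-time operation reported in the abstract.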