
Development of an immersive SLAM-based VR system for teleoperation of a mobile manipulator in an unknown environment


Bibliographic Details
Published in: Computers in Industry, 2021-11, Vol. 132, p. 103502, Article 103502
Main Authors: Kuo, Chen-Yu, Huang, Chun-Chi, Tsai, Chih-Hsuan, Shi, Yun-Shuo, Smith, Shana
Format: Article
Language:English
Description:
• A SLAM-based VR system is developed for teleoperation of a mobile manipulator.
• VR visual aids show direction and depth information.
• The grasping error of the SLAM-based VR teleoperation is 0.5 cm.
• The placing threshold of the SLAM-based VR teleoperation is 1.5 cm.

Operating a remote robot in an unknown environment is challenging. Usually, a camera attached to the robot captures images of the remote scene; however, it is difficult to perceive a 3D environment on a 2D display. In this study, a real-time simultaneous localization and mapping (SLAM)-based virtual reality (VR) 3D human-machine interaction system is developed to provide users with immersive telepresence for operating a remote mobile manipulator in an unknown environment. An RGB-depth camera mounted on the mobile platform scans and reconstructs the remote environment in real time. A head-mounted display shows the 3D reconstructed environment, with a virtual copy of the actual robot in it, to give users a feeling of immersion at the remote site. Visual aids added to the 3D reconstructed environment guide users with direction and distance information. Users control the remote robot by manipulating the virtual copy of the actual robot in the VR environment, without being constrained to the robot's first-person view. The system is compared with video-based teleoperation and direct on-site operation. Experimental results show that the grasping error and placing threshold of the SLAM-based VR teleoperation are 0.5 cm and 1.5 cm, respectively, and the average error of the reconstructed environment is less than 4%. Within the accuracy of the reconstructed environment, the SLAM-based VR teleoperation achieves a higher grasping success rate and shorter operation time than the video-based teleoperation. This study shows a promising application of immersive SLAM-based VR for interactive telerobotic operations in a remote unknown environment.
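The error figures quoted in the abstract (grasping error as a distance in centimeters, reconstruction error as a percentage) can be illustrated with a minimal sketch. The coordinates and dimensions below are hypothetical examples, not data from the paper, and the paper's actual measurement procedure is not reproduced here:

```python
import math

def euclidean_error_cm(target, actual):
    """Euclidean distance between a commanded target position and the
    position the robot actually reached, in centimeters."""
    return math.dist(target, actual)

def reconstruction_error_pct(true_dim_cm, measured_dim_cm):
    """Relative error of a dimension measured in the reconstructed
    3D environment, as a percentage of the true dimension."""
    return abs(measured_dim_cm - true_dim_cm) / true_dim_cm * 100

# Hypothetical grasp: target vs. achieved gripper position (cm)
err = euclidean_error_cm((10.0, 20.0, 5.0), (10.3, 20.4, 5.0))
print(f"grasping error: {err:.2f} cm")  # → 0.50 cm

# Hypothetical object edge: 30 cm in reality vs. 28.9 cm in the reconstruction
pct = reconstruction_error_pct(30.0, 28.9)
print(f"reconstruction error: {pct:.1f} %")  # → 3.7 %
```

A 0.5 cm grasping error and a sub-4% reconstruction error in this sense match the accuracies the abstract reports for the SLAM-based VR teleoperation.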
DOI: 10.1016/j.compind.2021.103502
ISSN: 0166-3615
EISSN: 1872-6194
Source: ScienceDirect Journals
Subjects: Immersive; Remote mobile manipulator; SLAM; Virtual reality; Visual aids