
A cooperative control framework for haptic guidance of bimanual surgical tasks based on Learning From Demonstration

Whilst current minimally invasive surgical robots offer many advantages to the surgeon, most of them are still controlled using the traditional master-slave approach, without fully exploiting the complementary strengths of both the human user and the robot. This paper proposes a framework that provides a cooperative control approach to human-robot interaction. Typical teleoperation is enhanced by incorporating haptic guidance-based feedback for surgical tasks, which are demonstrated to and learned by the robot. Safety in the surgical scene is maintained during reproduction of the learned tasks by including the surgeon in the guided execution of the learned task at all times. Continuous Hidden Markov Models are used for task learning, real-time learned task recognition and generating setpoint trajectories for haptic guidance. Two different surgical training tasks were demonstrated and encoded by the system, and the framework was evaluated using the Raven II surgical robot research platform. The results indicate an improvement in user task performance with the haptic guidance in comparison to unguided teleoperation.
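The abstract describes a pipeline in which continuous (Gaussian-emission) Hidden Markov Models encode demonstrated trajectories, recognise the task being executed in real time, and generate setpoints for haptic guidance. The sketch below illustrates that general idea only; it is not the authors' implementation, and the hmmlearn package, the placeholder demonstration data and all variable names are assumptions introduced for illustration.

```python
# Illustrative sketch only (not the paper's code): fit a continuous HMM to
# demonstrated tool-tip trajectories, score live motion against it, and derive
# guidance setpoints from the decoded hidden states.
import numpy as np
from hmmlearn.hmm import GaussianHMM

# Placeholder demonstrations: 3 trials of one task, each a sequence of
# (x, y, z) tool-tip positions. Real data would come from recorded teleoperation.
demos = [np.cumsum(np.random.randn(200, 3), axis=0) for _ in range(3)]
X = np.concatenate(demos)
lengths = [len(d) for d in demos]

# Task learning: one continuous HMM (Gaussian emissions) per demonstrated task.
model = GaussianHMM(n_components=8, covariance_type="diag", n_iter=100)
model.fit(X, lengths)

# Real-time task recognition: score a sliding window of live motion against
# each task model and pick the most likely task (only one model shown here).
window = demos[0][:50]
log_likelihood = model.score(window)

# Guidance setpoints: decode the most likely state sequence and take each
# state's mean emission as the setpoint for that phase of the task.
states = model.predict(window)
setpoints = model.means_[states]   # one (x, y, z) setpoint per observation
print(log_likelihood, setpoints.shape)
```

In the framework described by the abstract, such setpoint trajectories drive the haptic guidance felt by the surgeon at the master device; the scoring step stands in for the real-time recognition of which learned task is being performed.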

Bibliographic Details
Main Authors: Power, Maura; Rafii-Tari, Hedyeh; Bergeles, Christos; Vitiello, Valentina; Yang, Guang-Zhong
Format: Conference Proceeding
Language: English
Subjects: Encoding; Haptic interfaces; Instruments; Robot kinematics; Surgery; Trajectory
DOI: 10.1109/ICRA.2015.7139943
Published in: 2015 IEEE International Conference on Robotics and Automation (ICRA), 2015, p.5330-5337
ISSN: 1050-4729
EISSN: 2577-087X
Source: IEEE Xplore All Conference Series