DeepXPalm: Tilt and Position Rendering using Palm-worn Haptic Display and CNN-based Tactile Pattern Recognition
Telemanipulation of deformable objects requires high precision and dexterity from the users, which can be increased by kinesthetic and tactile feedback. However, the object shape can change dynamically, causing ambiguous perception of its alignment and hence errors in the robot positioning. Therefore, recognizing the tilt angle and position patterns sensed over the gripper fingertips is a classification problem that has to be solved to present a clear tactile pattern to the user. This work presents a telemanipulation system for plastic pipettes consisting of a multi-contact haptic interface, LinkGlide, delivering haptic feedback to the user's palm, and two tactile sensor arrays embedded in the 2-finger Robotiq gripper. We propose a novel approach based on Convolutional Neural Networks (CNN) to detect the tilt and position while grasping deformable objects. The CNN generates a mask based on the recognized tilt and position data to render multi-contact tactile stimuli provided to the user during the telemanipulation. The study has shown that with the CNN algorithm and the preset mask, tilt and position recognition by users increased from 9.67% with the direct data to 82.5%.
Main Authors: | Cabrera, Miguel Altamirano; Sautenkov, Oleg; Tirado, Jonathan; Fedoseev, Aleksey; Kopanev, Pavel; Kajimoto, Hiroyuki; Tsetserukou, Dzmitry |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Convolutional neural networks; Grasping; Pattern recognition; Plastics; Rendering (computer graphics); Shape; Tactile sensors |
Online Access: | Request full text |
cited_by | |
---|---|
cites | |
container_end_page | 6 |
container_issue | |
container_start_page | 1 |
container_title | |
container_volume | |
creator | Cabrera, Miguel Altamirano; Sautenkov, Oleg; Tirado, Jonathan; Fedoseev, Aleksey; Kopanev, Pavel; Kajimoto, Hiroyuki; Tsetserukou, Dzmitry |
description | Telemanipulation of deformable objects requires high precision and dexterity from the users, which can be increased by kinesthetic and tactile feedback. However, the object shape can change dynamically, causing ambiguous perception of its alignment and hence errors in the robot positioning. Therefore, recognizing the tilt angle and position patterns sensed over the gripper fingertips is a classification problem that has to be solved to present a clear tactile pattern to the user. This work presents a telemanipulation system for plastic pipettes consisting of a multi-contact haptic interface, LinkGlide, delivering haptic feedback to the user's palm, and two tactile sensor arrays embedded in the 2-finger Robotiq gripper. We propose a novel approach based on Convolutional Neural Networks (CNN) to detect the tilt and position while grasping deformable objects. The CNN generates a mask based on the recognized tilt and position data to render multi-contact tactile stimuli provided to the user during the telemanipulation. The study has shown that with the CNN algorithm and the preset mask, tilt and position recognition by users increased from 9.67% with the direct data to 82.5%. |
doi_str_mv | 10.1109/HAPTICS52432.2022.9765571 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | EISSN: 2324-7355; EISBN: 9781665420297; EISBN: 1665420294 |
ispartof | 2022 IEEE Haptics Symposium (HAPTICS), 2022, p.1-6 |
issn | 2324-7355 |
language | eng |
recordid | cdi_ieee_primary_9765571 |
source | IEEE Xplore All Conference Series |
subjects | Convolutional neural networks; Grasping; Pattern recognition; Plastics; Rendering (computer graphics); Shape; Tactile sensors |
title | DeepXPalm: Tilt and Position Rendering using Palm-worn Haptic Display and CNN-based Tactile Pattern Recognition |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-29T03%3A35%3A45IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_CHZPO&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=DeepXPalm:%20Tilt%20and%20Position%20Rendering%20using%20Palm-worn%20Haptic%20Display%20and%20CNN-based%20Tactile%20Pattern%20Recognition&rft.btitle=2022%20IEEE%20Haptics%20Symposium%20(HAPTICS)&rft.au=Cabrera,%20Miguel%20Altamirano&rft.date=2022-03-21&rft.spage=1&rft.epage=6&rft.pages=1-6&rft.eissn=2324-7355&rft_id=info:doi/10.1109/HAPTICS52432.2022.9765571&rft.eisbn=9781665420297&rft.eisbn_list=1665420294&rft_dat=%3Cieee_CHZPO%3E9765571%3C/ieee_CHZPO%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-i203t-d80764e840f1892a0543ab0be126f3e647caa64f24fa7629a675d917f1076d733%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=9765571&rfr_iscdi=true |