FingerSight: A Vibrotactile Wearable Ring for Assistance With Locating and Reaching Objects in Peripersonal Space
This paper describes a prototype guidance system, "FingerSight," to help people without vision locate and reach to objects in peripersonal space. It consists of four evenly spaced tactors embedded into a ring worn on the index finger, with a small camera mounted on top. Computer-vision analysis of the camera image controls vibrotactile feedback, leading users to move their hand to near targets. Two experiments tested the functionality of the prototype system. The first found that participants could discriminate between five different vibrotactile sites (four individual tactors and all simultaneously) with a mean accuracy of 88.8% after initial training. In the second experiment, participants were blindfolded and instructed to move their hand wearing the device to one of four locations within arm's reach, while hand trajectories were tracked. The tactors were controlled using two different strategies: (1) repeatedly signal the axis with the largest error, and (2) signal both axes in alternation. Participants demonstrated essentially straight-line trajectories toward the target under both instructions, but the temporal parameters (rate of approach, duration) showed an advantage for correction on both axes in sequence.
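The two guidance strategies described in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation: the tactor layout, deadzone threshold, and function names are assumptions, and the inputs `dx`, `dy` stand for the target's horizontal and vertical offset from the camera image center as produced by the (unspecified) computer-vision step.

```python
# Hypothetical sketch of the two tactor-control strategies (not the authors' code).
# Tactor layout, deadzone value, and names are assumptions for illustration only.
from typing import Optional

# Assumed layout of the four ring tactors: up, down, left, right.
TACTOR_UP, TACTOR_DOWN, TACTOR_LEFT, TACTOR_RIGHT = range(4)

DEADZONE = 0.05  # assumed: offsets smaller than this count as "on target"


def tactor_for_axis(axis: str, error: float) -> int:
    """Map a signed error on one axis to the tactor pointing toward the target."""
    if axis == "x":
        return TACTOR_RIGHT if error > 0 else TACTOR_LEFT
    return TACTOR_UP if error > 0 else TACTOR_DOWN


def largest_error_strategy(dx: float, dy: float) -> Optional[int]:
    """Strategy 1: repeatedly signal the axis with the largest remaining error."""
    if max(abs(dx), abs(dy)) < DEADZONE:
        return None  # within deadzone; using an all-tactor buzz as an on-target cue is an assumption
    if abs(dx) >= abs(dy):
        return tactor_for_axis("x", dx)
    return tactor_for_axis("y", dy)


def alternating_strategy(dx: float, dy: float, step: int) -> Optional[int]:
    """Strategy 2: signal the x and y corrections in alternation, one axis per step."""
    if max(abs(dx), abs(dy)) < DEADZONE:
        return None
    if step % 2 == 0 and abs(dx) >= DEADZONE:
        return tactor_for_axis("x", dx)
    if abs(dy) >= DEADZONE:
        return tactor_for_axis("y", dy)
    return tactor_for_axis("x", dx)  # only the x axis is still off target


if __name__ == "__main__":
    # Example: target is above and slightly to the right of the image center.
    print(largest_error_strategy(0.1, 0.4))   # -> TACTOR_UP (0)
    print(alternating_strategy(0.1, 0.4, 0))  # -> TACTOR_RIGHT (3)
```

Under this reading, strategy 1 always corrects the currently worst axis, while strategy 2 interleaves the two corrections regardless of their relative size; the paper's second experiment compares the resulting hand trajectories and timing.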
Published in: | IEEE Transactions on Haptics, 2020-04, Vol. 13 (2), p. 325-333 |
---|---|
Main Authors: | Satpute, Shantanu A.; Canady, Janet R.; Klatzky, Roberta L.; Stetten, George D. |
Format: | Article |
Language: | English |
Subjects: | Assistive technology; Axes (reference lines); Blind; Cameras; Fingers; Guidance; Haptic interfaces; Light emitting diodes; Peripersonal space; Prototypes; Tracking; Vibrations; Vision |
Online Access: | Get full text |
DOI: | 10.1109/TOH.2019.2945561 |
---|---|
ISSN: | 1939-1412 |
EISSN: | 2329-4051 |
CODEN: | ITHEBX |
Publisher: | IEEE, New York |
Source: | IEEE Electronic Library (IEL) Journals |
ORCID iDs: | 0000-0002-6788-5769; 0000-0001-9701-9186; 0000-0003-0300-8748 |