Wearable Travel Aid for Environment Perception and Navigation of Visually Impaired People
Published in: Electronics (Basel), 2019-06-20, Vol. 8 (6), p. 697
Main Authors: Bai, Jinqiang; Liu, Zhaoxiang; Lin, Yimin; Li, Ye; Lian, Shiguo; Liu, Dijun
Format: Article
Language: English
Subjects: Artificial neural networks; Blindness; Design; Eyewear; Human performance; Indoor environments; Inertial platforms; Navigation systems; Object recognition; Obstacle avoidance; People with disabilities; Perception; Semantics; Sensors; Smartphones; Speech recognition; Visual impairment; Wearable technology
Publisher: MDPI AG, Basel
ISSN: 2079-9292
EISSN: 2079-9292
DOI: 10.3390/electronics8060697
License: CC BY 4.0 (open access)
Abstract: Assistive devices for visually impaired people (VIP) which support daily traveling and improve social inclusion are developing fast. Most of them try to solve the problem of navigation or obstacle avoidance, and other works focus on helping VIP to recognize their surrounding objects. However, very few of them couple both capabilities (i.e., navigation and recognition). Aiming at the above needs, this paper presents a wearable assistive device that allows VIP to (i) navigate safely and quickly in unfamiliar environments, and (ii) recognize objects in both indoor and outdoor environments. The device consists of a consumer Red, Green, Blue and Depth (RGB-D) camera and an Inertial Measurement Unit (IMU), which are mounted on a pair of eyeglasses, and a smartphone. The device leverages the ground height continuity among adjacent image frames to segment the ground accurately and rapidly, and then searches for the moving direction according to the ground. A lightweight Convolutional Neural Network (CNN)-based object recognition system is developed and deployed on the smartphone to increase the perception ability of VIP and promote the navigation system. It can provide the semantic information of surroundings, such as the categories, locations, and orientations of objects. Human–machine interaction is performed through an audio module (a beeping sound for obstacle alerts, speech recognition for understanding user commands, and speech synthesis for expressing the semantic information of surroundings). We evaluated the performance of the proposed system through many experiments conducted in both indoor and outdoor scenarios, demonstrating the efficiency and safety of the proposed assistive system.
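The ground segmentation step described in the abstract can be made concrete with a short sketch. The following is a minimal, hypothetical Python example, not the authors' implementation: it back-projects a depth frame into a gravity-aligned point cloud using the IMU rotation, then keeps the pixels whose height stays continuous with the ground height tracked from the previous frame. The intrinsics (FX, FY, CX, CY), the tolerance HEIGHT_TOL, the sign conventions, and all function names are assumptions made for illustration only.

```python
import numpy as np

# Assumed pinhole intrinsics and threshold; placeholder values, not from the paper.
FX, FY = 525.0, 525.0      # focal lengths in pixels
CX, CY = 319.5, 239.5      # principal point
HEIGHT_TOL = 0.05          # allowed deviation (m) from the tracked ground height

def backproject(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (metres) into an (H*W)x3 camera-frame cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)

def segment_ground(depth_m, R_cam_to_world, cam_height_m, prev_ground_y):
    """Return a boolean ground mask plus an updated ground-height estimate.

    R_cam_to_world: 3x3 rotation from the IMU, aligning the cloud with gravity
    (world y axis assumed to point up). cam_height_m: eyeglass camera height
    above the ground. prev_ground_y: ground height tracked across adjacent
    frames, which enforces the "ground height continuity" constraint.
    """
    pts = backproject(depth_m)
    world = pts @ R_cam_to_world.T                 # rotate into the gravity-aligned frame
    height = cam_height_m + world[:, 1]            # signed height above the ground plane
    valid = depth_m.reshape(-1) > 0                # zero depth means no measurement
    mask = valid & (np.abs(height - prev_ground_y) < HEIGHT_TOL)
    new_ground_y = float(np.median(height[mask])) if mask.any() else prev_ground_y
    return mask.reshape(depth_m.shape), new_ground_y
```

From the resulting mask, a safe moving direction could then be searched along the segmented ground, for example by picking the image column with the longest unbroken run of ground pixels ahead of the user.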