
Brain-inspired multimodal hybrid neural network for robot place recognition

Place recognition is an essential spatial intelligence capability for robots to understand and navigate the world. However, recognizing places in natural environments remains a challenging task for robots because of resource limitations and changing environments. In contrast, humans and animals can robustly and efficiently recognize hundreds of thousands of places in different conditions. Here, we report a brain-inspired general place recognition system, dubbed NeuroGPR, that enables robots to recognize places by mimicking the neural mechanism of multimodal sensing, encoding, and computing through a continuum of space and time. Our system consists of a multimodal hybrid neural network (MHNN) that encodes and integrates multimodal cues from both conventional and neuromorphic sensors. Specifically, to encode different sensory cues, we built various neural networks of spatial view cells, place cells, head direction cells, and time cells. To integrate these cues, we designed a multiscale liquid state machine that can process and fuse multimodal information effectively and asynchronously using diverse neuronal dynamics and bioinspired inhibitory circuits. We deployed the MHNN on Tianjic, a hybrid neuromorphic chip, and integrated it into a quadruped robot. Our results show that NeuroGPR achieves better performance compared with conventional and existing biologically inspired approaches, exhibiting robustness to diverse environmental uncertainty, including perceptual aliasing, motion blur, light, or weather changes. Running NeuroGPR as an overall multi-neural network workload on Tianjic showcases its advantages with 10.5 times lower latency and 43.6% lower power consumption than the commonly used mobile robot processor Jetson Xavier NX.
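The abstract's core mechanism — encoding sensory cues as spike trains and fusing them in a liquid state machine whose state serves as a place signature — can be illustrated with a toy sketch. The Python snippet below is a hypothetical, heavily simplified illustration assuming leaky integrate-and-fire neurons, random sparse reservoir connectivity with an excitatory/inhibitory split, and cosine-similarity matching of rate vectors; it is not the authors' MHNN or the Tianjic deployment, and every name, size, and constant is invented for the example.

```python
# Minimal sketch (not the authors' code): a toy liquid state machine (LSM)
# fusing two spike-encoded sensory streams into a reservoir state, matched
# against stored "place" signatures by cosine similarity. All sizes,
# constants, and the encoding scheme are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 200          # reservoir neurons (assumed)
N_IN = 64        # input neurons per modality (assumed)
TAU = 20.0       # membrane time constant, ms (assumed)
V_TH = 1.0       # firing threshold
DT = 1.0         # simulation step, ms

# Sparse recurrent weights with a 20% inhibitory population, loosely echoing
# the paper's "bioinspired inhibitory circuits" (proportions assumed).
W = rng.normal(0, 0.1, (N, N)) * (rng.random((N, N)) < 0.1)
inhibitory = rng.random(N) < 0.2
W[:, inhibitory] *= -2.0

# Separate input projections per modality (e.g. frame camera vs. event camera).
W_vis = rng.normal(0, 0.5, (N, N_IN)) * (rng.random((N, N_IN)) < 0.3)
W_hd  = rng.normal(0, 0.5, (N, N_IN)) * (rng.random((N, N_IN)) < 0.3)

def run_lsm(vis_spikes, hd_spikes):
    """Drive the reservoir with two (T, N_IN) binary spike trains and
    return the time-averaged firing-rate vector as the place signature."""
    T = vis_spikes.shape[0]
    v = np.zeros(N)                       # membrane potentials
    spikes = np.zeros(N)
    rate = np.zeros(N)
    for t in range(T):
        i_in = W_vis @ vis_spikes[t] + W_hd @ hd_spikes[t] + W @ spikes
        v += DT / TAU * (-v) + i_in       # leaky integration
        spikes = (v >= V_TH).astype(float)
        v[spikes > 0] = 0.0               # reset after spiking
        rate += spikes
    return rate / T

def match_place(signature, database):
    """Return (best_index, cosine_similarity) against stored signatures."""
    sims = [signature @ s / (np.linalg.norm(signature) * np.linalg.norm(s) + 1e-9)
            for s in database]
    best = int(np.argmax(sims))
    return best, sims[best]

# Usage: store one place's signature, then re-query with a noise-corrupted view.
vis, hd = rng.random((100, N_IN)) < 0.1, rng.random((100, N_IN)) < 0.1
db = [run_lsm(vis.astype(float), hd.astype(float))]
vis_n = vis ^ (rng.random((100, N_IN)) < 0.02)   # flip ~2% of spikes
hd_n = hd ^ (rng.random((100, N_IN)) < 0.02)
idx, sim = match_place(run_lsm(vis_n.astype(float), hd_n.astype(float)), db)
print(f"matched place {idx} with similarity {sim:.3f}")
```

The reservoir never needs training here: its fixed random dynamics project the fused spike streams into a high-dimensional state, and only the readout (here, a nearest-neighbor lookup) depends on stored places — one reason LSM-style fusion suits low-power neuromorphic hardware.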

Bibliographic Details
Published in: Science Robotics, 2023-05-17, Vol. 8 (78), p. eabm6996
Main Authors: Yu, Fangwen, Wu, Yujie, Ma, Songchen, Xu, Mingkun, Li, Hongyi, Qu, Huanyu, Song, Chenhang, Wang, Taoyi, Zhao, Rong, Shi, Luping
Format: Article
Language:English
Subjects: Algorithms; Animals; Brain - physiology; Humans; Neural Networks, Computer; Neurons - physiology; Robotics - methods
ISSN: 2470-9476
DOI: 10.1126/scirobotics.abm6996
PMID: 37163608