MapLocNet: Coarse-to-Fine Feature Registration for Visual Re-Localization in Navigation Maps

Robust localization is the cornerstone of autonomous driving, especially in challenging urban environments where GPS signals suffer from multipath errors. Traditional localization approaches rely on high-definition (HD) maps, which consist of precisely annotated landmarks. However, building HD maps is expensive and difficult to scale up. Given these limitations, leveraging navigation maps has emerged as a promising low-cost alternative for localization. Current approaches based on navigation maps can achieve highly accurate localization, but their complex matching strategies lead to unacceptable inference latency that fails to meet real-time demands. To address these limitations, we introduce MapLocNet, a novel transformer-based neural re-localization method. Inspired by image registration, our approach performs coarse-to-fine neural feature registration between navigation map features and visual bird's-eye-view features. MapLocNet substantially outperforms the current state-of-the-art methods on both the nuScenes and Argoverse datasets, demonstrating significant improvements in localization accuracy and inference speed across both single-view and surround-view input settings. We highlight that our research presents an HD-map-free localization method for autonomous driving, offering a cost-effective, reliable, and scalable solution for challenging urban environments.
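
For intuition about the coarse-to-fine feature registration named in the abstract, below is a minimal PyTorch sketch: bird's-eye-view (BEV) tokens cross-attend to navigation-map tokens at a coarse resolution to estimate an initial pose offset, and a second stage refines it at full resolution. This is an illustration assembled from the abstract alone; the module names, feature dimensions, pooling factor, and pose-regression head are all assumptions, not the authors' architecture, and a real system would warp the features by the coarse estimate before refining.

```python
# Hypothetical sketch of coarse-to-fine registration between BEV and
# navigation-map features; illustrative only, not the MapLocNet code.
import torch
import torch.nn as nn


class RegistrationStage(nn.Module):
    """One stage: BEV tokens cross-attend to map tokens, then a small
    head regresses a (dx, dy, dyaw) pose correction."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.head = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, 3))

    def forward(self, bev_feat: torch.Tensor, map_feat: torch.Tensor) -> torch.Tensor:
        # (B, C, H, W) feature maps -> (B, H*W, C) token sequences
        q = bev_feat.flatten(2).transpose(1, 2)
        kv = map_feat.flatten(2).transpose(1, 2)
        fused, _ = self.attn(q, kv, kv)       # BEV queries attend to map keys/values
        return self.head(fused.mean(dim=1))   # (B, 3): dx, dy, dyaw


class CoarseToFineRegistration(nn.Module):
    """A coarse stage on 4x-downsampled features narrows the search;
    a fine stage on full-resolution features refines the estimate."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.pool = nn.AvgPool2d(4)
        self.coarse = RegistrationStage(dim)
        self.fine = RegistrationStage(dim)

    def forward(self, bev_feat: torch.Tensor, map_feat: torch.Tensor) -> torch.Tensor:
        coarse_pose = self.coarse(self.pool(bev_feat), self.pool(map_feat))
        # A full system would warp bev_feat by coarse_pose before refining;
        # this sketch simply sums the two corrections.
        fine_pose = self.fine(bev_feat, map_feat)
        return coarse_pose + fine_pose


if __name__ == "__main__":
    bev = torch.randn(2, 128, 32, 32)  # BEV features lifted from camera images
    nav = torch.randn(2, 128, 32, 32)  # rasterized navigation-map features
    print(CoarseToFineRegistration()(bev, nav).shape)  # torch.Size([2, 3])
```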

Bibliographic Details
Main Authors: Wu, Hang; Zhang, Zhenghao; Lin, Siyuan; Mu, Xiangru; Zhao, Qiang; Yang, Ming; Qin, Tong
Format: Conference Proceeding
Language: English
Subjects: Accuracy; Autonomous vehicles; Location awareness; Navigation; Real-time systems; Reliability; Semantics; Transformers; Urban areas; Visualization
Online Access: Request full text
DOI: 10.1109/IROS58592.2024.10802757
EISSN: 2153-0866
EISBN: 9798350377705
Published in: Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems, 2024, pp. 13198-13205
Source: IEEE Xplore All Conference Series