MapLocNet: Coarse-to-Fine Feature Registration for Visual Re-Localization in Navigation Maps

Bibliographic Details
Main Authors: Wu, Hang; Zhang, Zhenghao; Lin, Siyuan; Mu, Xiangru; Zhao, Qiang; Yang, Ming; Qin, Tong
Format: Conference Proceeding
Language: English
Description
Summary: Robust localization is the cornerstone of autonomous driving, especially in challenging urban environments where GPS signals suffer from multipath errors. Traditional localization approaches rely on high-definition (HD) maps, which consist of precisely annotated landmarks. However, building HD maps is expensive and difficult to scale up. Given these limitations, leveraging navigation maps has emerged as a promising low-cost alternative for localization. Current approaches based on navigation maps can achieve highly accurate localization, but their complex matching strategies lead to unacceptable inference latency that fails to meet real-time demands. To address these limitations, we introduce MapLocNet, a novel transformer-based neural re-localization method. Inspired by image registration, our approach performs a coarse-to-fine neural feature registration between navigation map features and visual bird's-eye view features. MapLocNet substantially outperforms the current state-of-the-art methods on both the nuScenes and Argoverse datasets, demonstrating significant improvements in localization accuracy and inference speed across both single-view and surround-view input settings. We highlight that our research presents an HD-map-free localization method for autonomous driving, offering a cost-effective, reliable, and scalable solution for challenging urban environments.
ISSN:2153-0866
DOI:10.1109/IROS58592.2024.10802757