A Modular and Multi-Modal Mapping Framework

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, 2023-01, Vol. 8 (2), p. 520
Main Authors: Cramariuc, Andrei; Bernreiter, Lukas; Tschopp, Florian; Fehr, Marius; Reijgwart, Victor; Nieto, Juan; Siegwart, Roland; Cadena, Cesar
Format: Article
Language: English
Description
Summary: Integration of multiple sensor modalities and deep learning into Simultaneous Localization And Mapping (SLAM) systems are areas of significant interest in current research. Multi-modality is a stepping stone towards achieving robustness in challenging environments and interoperability of heterogeneous multi-robot systems with varying sensor setups. With maplab 2.0, we provide a versatile open-source platform that facilitates developing, testing, and integrating new modules and features into a fully-fledged SLAM system. Through extensive experiments, we show that maplab 2.0's accuracy is comparable to the state-of-the-art on the HILTI 2021 benchmark. Additionally, we showcase the flexibility of our system with three use cases: i) large-scale (∼10 km) multi-robot multi-session (23 missions) mapping, ii) integration of non-visual landmarks, and iii) incorporating a semantic object-based loop closure module into the mapping framework.
ISSN: 2377-3766
DOI: 10.1109/LRA.2022.3227865