
PLI-SLAM: A Tightly-Coupled Stereo Visual-Inertial SLAM System with Point and Line Features

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), 2023-10, Vol. 15 (19), p. 4678
Main Authors: Teng, Zhaoyu; Han, Bin; Cao, Jie; Hao, Qun; Tang, Xin; Li, Zhaoyang
Format: Article
Language: English
Description
Summary: Point feature-based visual simultaneous localization and mapping (SLAM) systems are prone to performance degradation in low-texture environments due to insufficient extraction of point features. In this paper, we propose a tightly-coupled stereo visual-inertial SLAM system with point and line features (PLI-SLAM) to enhance the robustness and reliability of such systems in low-texture environments. We improve Edge Drawing lines (EDlines) for line feature detection by introducing curvature detection and a new criterion for minimum line segment length, which improves the accuracy of the line features while reducing line feature detection time. We also contribute an improved adaptive factor, tuned experimentally, that adjusts the error weight of line features and further improves the localization accuracy of the system. Our system has been tested on the EuRoC dataset; tests on public datasets and in real environments show that PLI-SLAM achieves high accuracy and still operates robustly in some challenging environments. The processing time of our method is reduced by 28% compared to ORB-LINE-SLAM, a point- and line-based method that uses the Line Segment Detector (LSD).
ISSN: 2072-4292
DOI: 10.3390/rs15194678
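
To make two of the ideas in the abstract more concrete, below is a minimal, hypothetical C++ sketch (not the authors' code): it rejects detected line segments shorter than a minimum-length threshold and assigns a length-dependent weight to a line feature's error term. The threshold ratio, the saturation length, and the weight formula are illustrative assumptions; the paper's curvature test and its exact adaptive factor are not reproduced here.

// Minimal sketch, assuming a simple pinhole-image line representation.
// The minimum-length ratio and the weight formula are illustrative, not PLI-SLAM's.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

struct LineSegment {
    double x1, y1, x2, y2;  // endpoints in pixel coordinates
    double length() const { return std::hypot(x2 - x1, y2 - y1); }
};

// Keep only segments longer than a fraction of the image diagonal
// (one common way to state a minimum-line-segment-length criterion).
std::vector<LineSegment> filterByMinLength(const std::vector<LineSegment>& lines,
                                           int imgWidth, int imgHeight,
                                           double ratio /* assumed, e.g. 0.05 */) {
    const double minLen = ratio * std::hypot(imgWidth, imgHeight);
    std::vector<LineSegment> kept;
    for (const auto& l : lines)
        if (l.length() >= minLen) kept.push_back(l);
    return kept;
}

// Hypothetical length-based weight: longer segments are usually observed more
// reliably, so their residuals get a larger weight (saturating at 1).
double lineWeight(const LineSegment& l, double saturationLength /* assumed */) {
    return std::min(1.0, l.length() / saturationLength);
}

int main() {
    std::vector<LineSegment> detections = {
        {10, 10, 14, 12},     // short segment, likely noise
        {50, 60, 250, 180}};  // long, more reliable segment
    // 752x480 is the EuRoC stereo image resolution.
    auto kept = filterByMinLength(detections, 752, 480, 0.05);
    for (const auto& l : kept)
        std::cout << "kept line, length = " << l.length()
                  << ", weight = " << lineWeight(l, 200.0) << "\n";
    return 0;
}

In a tightly-coupled system such weights would scale the line reprojection residuals inside the joint visual-inertial optimization; the sketch only shows how a per-segment weight could be derived from geometry.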