Enhancing Indoor Robot Pedestrian Detection Using Improved PIXOR Backbone and Gaussian Heatmap Regression in 3D LiDAR Point Clouds

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 9162-9176
Main Authors: Nguyen, Duy Anh; Hoang, Khang Nguyen; Nguyen, Nguyen Trung; Tran, Hoang Ngoc
Format: Article
Language: English
Description
Summary: Accurate and robust pedestrian detection is fundamental for indoor robotic systems to navigate safely and seamlessly alongside humans in spatially constrained, unpredictable indoor environments. This paper presents IRBGHR-PIXOR, a detection framework specifically engineered for pedestrian perception on indoor mobile robots. The approach is an enhanced adaptation of the PIXOR model with two pivotal augmentations: a redesigned convolutional backbone built from Inverted Residual Blocks (IRB) combined with Gaussian Heatmap Regression (GHR), and a Modified Focal Loss (MFL) function that addresses class imbalance. The IRB backbone strengthens the network's ability to process the spatial representations derived from sparse 3D LiDAR scans, while GHR improves localization accuracy by modeling a probability distribution over each pedestrian's center location in the point cloud. Evaluated extensively on the large-scale JRDB dataset, which comprises dense scans from 16-beam Velodyne LiDAR sensors, IRBGHR-PIXOR attains 97.17% Average Precision (AP) at the 0.5 IoU threshold, and does so without significantly increasing model complexity. By improving detection in confined indoor environments, this research supports the safe and effective deployment of autonomous technologies in human-centered spaces previously hindered by perceptual limitations. Evaluating performance on diverse edge cases and integrating complementary sensory cues promise continued progress. These developments contribute to the reliable dynamic perception required by next-generation robotic systems coexisting in human-centric environments.
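
Note: The summary names three components (an IRB-based backbone, Gaussian heatmap regression of pedestrian centers, and a modified focal loss) without detailing them. The sketch below is a minimal illustration of what such components commonly look like in PyTorch; it is not the authors' implementation. The function and class names (draw_gaussian, ModifiedFocalLoss, InvertedResidualBlock) and all hyperparameters (sigma, alpha, beta, expansion) are illustrative assumptions, following CenterNet-style heatmap detection and MobileNetV2-style inverted residual blocks.

import torch
import torch.nn as nn


def draw_gaussian(heatmap: torch.Tensor, cx: int, cy: int, sigma: float) -> torch.Tensor:
    """Splat a 2-D Gaussian centered on a pedestrian into a BEV heatmap target."""
    h, w = heatmap.shape
    ys = torch.arange(h, dtype=torch.float32).view(-1, 1)
    xs = torch.arange(w, dtype=torch.float32).view(1, -1)
    g = torch.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))
    return torch.maximum(heatmap, g)  # keep the stronger response where pedestrians overlap


class ModifiedFocalLoss(nn.Module):
    """Penalty-reduced focal loss on the predicted center heatmap (CenterNet-style)."""

    def __init__(self, alpha: float = 2.0, beta: float = 4.0):
        super().__init__()
        self.alpha, self.beta = alpha, beta

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        pred = pred.clamp(1e-6, 1.0 - 1e-6)
        pos = target.eq(1.0).float()          # exact center pixels
        neg = 1.0 - pos                       # everything else, down-weighted near centers
        pos_loss = -((1.0 - pred) ** self.alpha) * torch.log(pred) * pos
        neg_loss = -((1.0 - target) ** self.beta) * (pred ** self.alpha) * torch.log(1.0 - pred) * neg
        num_pos = pos.sum().clamp(min=1.0)
        return (pos_loss.sum() + neg_loss.sum()) / num_pos


class InvertedResidualBlock(nn.Module):
    """Expand -> depthwise -> project block (MobileNetV2-style) for a BEV backbone."""

    def __init__(self, channels: int, expansion: int = 4):
        super().__init__()
        hidden = channels * expansion
        self.block = nn.Sequential(
            nn.Conv2d(channels, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden, bias=False),
            nn.BatchNorm2d(hidden), nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, channels, 1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.block(x)  # residual connection in the narrow feature space


# Example (illustrative): a 200x200 BEV target with one pedestrian center at (120, 80)
# target = draw_gaussian(torch.zeros(200, 200), cx=120, cy=80, sigma=2.0)
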
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3351868