A Spatial Calibration Method for Robust Cooperative Perception

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, 2024-05, Vol. 9 (5), p. 4011-4018
Main Authors: Song, Zhiying, Xie, Tenghui, Zhang, Hailiang, Liu, Jiaxin, Wen, Fuxi, Li, Jun
Format: Article
Language: English
Description
Summary: Cooperative perception is a promising technique for intelligent and connected vehicles through vehicle-to-everything (V2X) cooperation, provided that accurate pose information and relative pose transforms are available. Nevertheless, obtaining precise positioning information often entails the high costs associated with navigation systems. Hence, relative pose information must be calibrated for multi-agent cooperative perception. This letter proposes a simple but effective object association approach named context-based matching (CBM), which identifies inter-agent object correspondences using intra-agent geometrical context. Specifically, the method constructs a context for each object from the relative positions of the detected bounding boxes, then performs local context matching followed by global consensus maximization. The optimal relative pose transform is estimated from the matched correspondences and used for cooperative perception fusion. Extensive experiments are conducted on both simulated and real-world datasets. Even with large inter-agent localization errors, high object association precision and decimeter-level relative pose calibration accuracy are achieved among the cooperating agents.
ISSN: 2377-3766
DOI: 10.1109/LRA.2024.3374168
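
The summary above outlines a three-stage pipeline: per-object geometric contexts, local context matching, and global consensus before pose estimation. Below is a minimal, hypothetical Python sketch of that idea for 2D box centers; it substitutes a simple greedy match and a least-squares (Kabsch) rigid alignment for the paper's local matching and global consensus maximization steps. Every function name and tolerance here is an illustrative assumption, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): associate two agents'
# detected object centers via distance-based contexts, then recover
# the relative pose with a least-squares (Kabsch) rigid alignment.
import numpy as np

def build_contexts(centers: np.ndarray) -> np.ndarray:
    """Per-object context: sorted distances to every other detection.
    Sorted pairwise distances are invariant to rotation and translation."""
    dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    return np.sort(dists, axis=1)[:, 1:]  # drop the zero self-distance

def match_contexts(ctx_a: np.ndarray, ctx_b: np.ndarray, tol: float = 0.5):
    """Greedy local matching: pair objects whose context vectors agree
    within tol (meters) on the overlapping prefix."""
    matches = []
    for i, ca in enumerate(ctx_a):
        errs = [np.mean(np.abs(ca[: min(len(ca), len(cb))]
                               - cb[: min(len(ca), len(cb))])) for cb in ctx_b]
        j = int(np.argmin(errs))
        if errs[j] < tol:
            matches.append((i, j))
    return matches

def estimate_transform(pts_a: np.ndarray, pts_b: np.ndarray):
    """Least-squares rigid transform (Kabsch) with pts_a ~ R @ pts_b + t,
    standing in here for the paper's consensus-based pose estimation."""
    mu_a, mu_b = pts_a.mean(axis=0), pts_b.mean(axis=0)
    H = (pts_b - mu_b).T @ (pts_a - mu_a)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_a - R @ mu_b

# Toy check: the same 8 objects observed from two different poses.
rng = np.random.default_rng(0)
pts_a = rng.uniform(0.0, 50.0, (8, 2))             # centers in agent A's frame
theta, t_true = 0.3, np.array([5.0, -2.0])
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
pts_b = (pts_a - t_true) @ R_true                  # same centers in B's frame
matches = match_contexts(build_contexts(pts_a), build_contexts(pts_b))
ia, ib = (np.array(idx) for idx in zip(*matches))
R_est, t_est = estimate_transform(pts_a[ia], pts_b[ib])
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

Because pairwise distances are preserved under rigid motion, the sorted-distance contexts match exactly when both agents detect the same objects; with real, noisy detections, a consensus stage like the one the letter describes would be needed to reject outlier correspondences.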