Robust 6-D Pose Estimation of the UAV Based on Hybrid Features
Published in: IEEE Transactions on Instrumentation and Measurement, 2024, Vol. 73, pp. 1-13
Main Authors:
Format: Article
Language: English
Summary: Unmanned aerial vehicles (UAVs) of various shapes and sizes have rapidly emerged, creating an urgent need for a highly transferable, fast, and robust 6-D pose estimation method for visual tasks such as UAV swarming and monitoring. To address this need, this article presents a robust method for estimating the 6-D pose of a UAV relative to an airborne camera based on hybrid features. First, we employ KAPAO3D to simultaneously obtain pose objects, using 3-D losses with implicit keypoint associations, and keypoint objects, predicted from explicit semantic information. Leveraging prior pose object knowledge enables the establishment of precise 2-D-3-D correspondences, facilitating the creation of fusion objects that alleviate missed detections and false positives in keypoint object predictions. Second, pose objects and fusion objects are used to construct point and edge constraints, respectively, to obtain an initial pose. We then minimize the 3-D errors of the hybrid features, weighted by predicted confidence scores, to further improve accuracy. Experimental results demonstrate that our method adapts to various types of UAVs and achieves superior performance, with higher ADD and Te metrics than competing methods and a running speed of 30 frames/s. This method is expected to have practical applications in high-precision real-time tracking tasks.
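The pipeline the abstract describes (detect 2-D keypoints on the UAV, match them to 3-D model points, solve for an initial 6-D pose, and score the result with the ADD metric) can be illustrated with a short sketch. The following is a minimal illustration assuming OpenCV and NumPy, not the paper's implementation: the KAPAO3D detector, the fusion objects, and the edge constraints are omitted, and the function and array names are hypothetical stand-ins for the detector's outputs.

```python
# Minimal sketch of the pose-from-correspondences stage, assuming OpenCV
# and NumPy. Arrays below stand in for KAPAO3D outputs (hypothetical).
import cv2
import numpy as np

def estimate_pose(pts_2d, pts_3d, conf, K):
    """Initial 6-D pose from 2-D-3-D keypoint correspondences via PnP.

    pts_2d : (N, 2) detected keypoints in pixels
    pts_3d : (N, 3) matching points on the UAV model, in the model frame
    conf   : (N,)  detector confidence scores, used to drop weak keypoints
    K      : (3, 3) camera intrinsic matrix
    """
    keep = conf > 0.5  # simple confidence gating (assumed threshold)
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        pts_3d[keep].astype(np.float64),
        pts_2d[keep].astype(np.float64),
        K, distCoeffs=None)
    if not ok:
        raise RuntimeError("PnP failed")
    return rvec, tvec

def add_metric(model_pts, rvec_est, tvec_est, rvec_gt, tvec_gt):
    """ADD: mean 3-D distance between model points transformed by the
    estimated pose and by the ground-truth pose (lower is better)."""
    R_est, _ = cv2.Rodrigues(rvec_est)
    R_gt, _ = cv2.Rodrigues(rvec_gt)
    p_est = model_pts @ R_est.T + tvec_est.reshape(1, 3)
    p_gt = model_pts @ R_gt.T + tvec_gt.reshape(1, 3)
    return float(np.linalg.norm(p_est - p_gt, axis=1).mean())
```

The simple confidence gate above only drops weak detections; the paper goes further and minimizes the confidence-weighted 3-D errors of the hybrid point and edge features to refine the RANSAC-PnP estimate.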
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2024.3481530