OLOD: a new UAV dataset and benchmark for single tiny object tracking

Bibliographic Details
Published in:International journal of remote sensing 2024-07, Vol.45 (13), p.4255-4277
Main Authors: Yu, Mengfan, Duan, Yulong, Wan, You, Lu, Xin, Lyu, Shubin, Li, Fusheng
Format: Article
Language:English
Description
Summary: The integration of visual data obtained from unmanned aerial vehicles (UAVs) has ushered in a new era of computer vision, greatly expanding the possibilities for object tracking applications. Nevertheless, existing UAV datasets predominantly focus on large-scale objects characterized by distinct contours, overlooking the single tiny objects encountered in real-world flight scenarios. Extracting appearance information from these diminutive objects poses a considerable challenge for object tracking. To rectify this imbalance in data distribution, we propose a UAV dataset called Overhead Look Of Drones (OLOD), encompassing 70 sequences meticulously designed for tiny object tracking. It contains over 55k frames and provides supplementary information about altitude and flight attitude. Additionally, we incorporate 11 challenging attributes to enhance the complexity of the scenes, thereby establishing a comprehensive benchmark for single object tracking. OLOD serves as a valuable tool for evaluating the capabilities of various tracking algorithms on tiny objects. Through experimental results on this benchmark, we shed light on the limitations of existing methods for tracking tiny objects, underscoring the need for further research in this field. Our dataset and evaluation code will be released at https://github.com/yuymf/OLOD.
ISSN: 0143-1161
1366-5901
DOI: 10.1080/01431161.2024.2354127