OLOD: a new UAV dataset and benchmark for single tiny object tracking

The integration of visual data obtained from unmanned aerial vehicles (UAVs) has ushered in a new era of computer vision, greatly expanding the possibilities for object tracking applications. Nevertheless, existing UAV datasets predominantly focus on large-scale objects with distinct contours, overlooking the single tiny objects encountered in real-world flight scenarios. Extracting appearance information from such diminutive objects poses a considerable challenge for object tracking. To rectify this imbalance in data distribution, we propose a UAV dataset called Overhead Look Of Drones (OLOD), encompassing 70 sequences meticulously designed for tiny object tracking. It contains over 55k frames and provides supplementary information on altitude and flight attitude. Additionally, we incorporate 11 challenging attributes that enhance the complexity of the scenes, thereby establishing a comprehensive benchmark for single object tracking. OLOD serves as a valuable tool for evaluating how well various algorithms track tiny objects. Through experimental results on this benchmark, we shed light on the limitations of existing methods for tracking tiny objects, underscoring the necessity for further research in this field. Our dataset and evaluation code will be released at https://github.com/yuymf/OLOD.
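The abstract positions OLOD as a benchmark for evaluating single-object trackers on tiny targets. As a rough illustration of how such benchmarks are commonly scored, the sketch below computes an OTB-style success curve from per-frame bounding-box overlap (IoU). The file names, groundtruth format, and AUC shorthand are assumptions made for illustration only, not the dataset's actual evaluation protocol, for which the authors' own code is to be released at the GitHub link above.

# Illustrative sketch only: a generic one-pass-evaluation (OPE) success score
# of the kind commonly used by single-object-tracking benchmarks. The file
# layout and groundtruth format below are assumptions, not the actual OLOD
# toolkit API (see https://github.com/yuymf/OLOD for the official code).
import numpy as np

def iou(pred, gt):
    """IoU between two [x, y, w, h] boxes."""
    x1 = max(pred[0], gt[0])
    y1 = max(pred[1], gt[1])
    x2 = min(pred[0] + pred[2], gt[0] + gt[2])
    y2 = min(pred[1] + pred[3], gt[1] + gt[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = pred[2] * pred[3] + gt[2] * gt[3] - inter
    return inter / union if union > 0 else 0.0

def success_curve(pred_boxes, gt_boxes, thresholds=np.linspace(0, 1, 21)):
    """Fraction of frames whose overlap exceeds each IoU threshold."""
    overlaps = np.array([iou(p, g) for p, g in zip(pred_boxes, gt_boxes)])
    return np.array([(overlaps > t).mean() for t in thresholds])

# Example: success AUC for one hypothetical sequence, where pred.txt and
# groundtruth.txt are assumed to hold one "x,y,w,h" box per frame.
if __name__ == "__main__":
    pred = np.loadtxt("pred.txt", delimiter=",")
    gt = np.loadtxt("groundtruth.txt", delimiter=",")
    curve = success_curve(pred, gt)
    print(f"Success AUC: {curve.mean():.3f}")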

Bibliographic Details
Published in: International Journal of Remote Sensing, 2024-07, Vol. 45 (13), p. 4255-4277
Main Authors: Yu, Mengfan; Duan, Yulong; Wan, You; Lu, Xin; Lyu, Shubin; Li, Fusheng
Format: Article
Language: English
Publisher: Taylor & Francis, London
DOI: 10.1080/01431161.2024.2354127
ISSN: 0143-1161
EISSN: 1366-5901
Source: Taylor and Francis Science and Technology Collection
Subjects: Algorithms; altitude; Benchmarks; Computer vision; data collection; Datasets; Drone aircraft; Flight; single object tracking; tiny object; Tracking; UAV tracking dataset; Unmanned aerial vehicles