Visual Object Tracking With Mutual Affinity Aligned to Human Intuition
| Published in: | IEEE Transactions on Multimedia, 2024, Vol. 26, pp. 10055-10068 |
|---|---|
| Main Authors: | , , , , |
| Format: | Article |
| Language: | English |
| Summary: | Single-object tracking generally advances by incrementally determining the tracked target's position through interactions between the search region and the template. However, the template provides less information than the search region in terms of both temporal cues and spatial resolution. To alleviate this imbalance, we introduce an anthropic tracking framework, MATrack (Mutual Affinity Tracker), which explicitly strengthens weak template information and implicitly reduces background clutter through interactions between multiple templates and the search region. Additionally, we propose a coarse-to-fine localization approach that combines the benefits of corner-based and center-based methods. This approach enables us to simultaneously update the most recent state and background information without two-stage training. MATrack achieves state-of-the-art performance on multiple test benchmarks, including GOT-10k, LaSOT, TrackingNet, OTB-100, UAV123, and NFS30. Among these benchmarks, MATrack-320 stands out, particularly on the short-term tracking dataset GOT-10k, where it achieves an average overlap (AO) of 77.3. We also conduct comprehensive quantitative and qualitative evaluations to demonstrate that our method significantly outperforms other state-of-the-art approaches. |
| ISSN: | 1520-9210, 1941-0077 |
| DOI: | 10.1109/TMM.2024.3405654 |
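
The summary describes "mutual affinity" only at a high level: multiple templates and the search region exchange information so that weak template features are strengthened and background clutter is suppressed. Below is a minimal sketch of that idea using standard bidirectional cross-attention; the class name `MutualAffinityBlock`, the token dimensions, and the update order are illustrative assumptions, not the architecture reported in the paper.

```python
# A hypothetical sketch of template/search mutual interaction, assuming
# standard multi-head cross-attention. All names and sizes are illustrative;
# this record does not specify MATrack's actual design.
import torch
import torch.nn as nn


class MutualAffinityBlock(nn.Module):
    """Bidirectional cross-attention between template tokens and search tokens."""

    def __init__(self, dim: int = 256, num_heads: int = 8):
        super().__init__()
        # Templates query the search region, absorbing temporal/background context.
        self.t2s = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # The search region queries the (strengthened) templates.
        self.s2t = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_t = nn.LayerNorm(dim)
        self.norm_s = nn.LayerNorm(dim)

    def forward(self, templates: torch.Tensor, search: torch.Tensor):
        # templates: (B, K*Nt, C) tokens from K templates; search: (B, Ns, C).
        t_up, _ = self.t2s(templates, search, search)     # enrich template tokens
        templates = self.norm_t(templates + t_up)
        s_up, _ = self.s2t(search, templates, templates)  # target-aware search tokens
        search = self.norm_s(search + s_up)
        return templates, search


if __name__ == "__main__":
    block = MutualAffinityBlock()
    tpl = torch.randn(2, 3 * 64, 256)   # 3 templates, 64 tokens each
    srch = torch.randn(2, 400, 256)     # 20x20 search feature map, flattened
    tpl, srch = block(tpl, srch)
    print(tpl.shape, srch.shape)        # (2, 192, 256) and (2, 400, 256)
```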
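
The summary also mentions a coarse-to-fine localization scheme that combines corner-based and center-based heads. The sketch below shows one plausible reading under that assumption: a corner branch proposes a coarse box via soft-argmax, and a center branch refines it with offsets read at the most confident location. The fusion rule, the `0.1` scale, and all layer names are hypothetical; the record does not give MATrack's actual head design.

```python
# A hypothetical coarse-to-fine head: corner scores give a coarse box,
# center-guided offsets refine it. Illustrative only.
import torch
import torch.nn as nn


class CoarseToFineHead(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        # Corner branch: score maps for top-left and bottom-right corners.
        self.corners = nn.Conv2d(dim, 2, kernel_size=1)
        # Center branch: a center score map plus a 4-d refinement per location.
        self.center = nn.Conv2d(dim, 1, kernel_size=1)
        self.offset = nn.Conv2d(dim, 4, kernel_size=1)

    @staticmethod
    def _soft_argmax(score: torch.Tensor):
        # Expected (x, y), normalized to [0, 1], under a softmax over the map.
        b, _, h, w = score.shape
        prob = score.flatten(2).softmax(-1).view(b, 1, h, w)
        xs = torch.linspace(0, 1, w, device=score.device).view(1, 1, 1, w)
        ys = torch.linspace(0, 1, h, device=score.device).view(1, 1, h, 1)
        return (prob * xs).sum(dim=(2, 3)), (prob * ys).sum(dim=(2, 3))

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W) target-aware search-region features.
        corner_scores = self.corners(feat)
        tlx, tly = self._soft_argmax(corner_scores[:, 0:1])
        brx, bry = self._soft_argmax(corner_scores[:, 1:2])
        coarse = torch.cat([tlx, tly, brx, bry], dim=-1)   # (B, 4) coarse box

        # Refine with offsets predicted at the most confident center location.
        b, _, h, w = feat.shape
        idx = self.center(feat).flatten(2).argmax(-1)      # (B, 1)
        offsets = self.offset(feat).flatten(2)             # (B, 4, H*W)
        delta = torch.gather(offsets, 2, idx.unsqueeze(1).expand(b, 4, 1))
        return coarse + 0.1 * delta.squeeze(-1)            # (B, 4) refined box


if __name__ == "__main__":
    head = CoarseToFineHead()
    box = head(torch.randn(2, 256, 20, 20))
    print(box.shape)  # torch.Size([2, 4])
```

A single forward pass producing the refined box directly is consistent with the summary's claim that the tracker avoids two-stage training, since both branches can be supervised jointly.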