
Improved SwinTrack single target tracking algorithm based on spatio-temporal feature fusion

Bibliographic Details
Published in: IET Image Processing, 2023-06, Vol. 17 (8), pp. 2410-2421
Main Authors: Zhao, Min; Yue, Qiang; Sun, Dihua; Zhong, Yuan
Format: Article
Language:English
Description
Summary: Single-target tracking based on computer vision helps to collect, analyse, and exploit target information. SwinTrack has received widespread attention as one of the Siamese (twin) network trackers with the best trade-off between tracking accuracy and speed, but it suffers from two weaknesses: insufficient fusion of deep and shallow features, which loses shallow detail, and insufficient use of temporal information, which causes inconsistency between the target and the template. To fuse features effectively in space, a multi-level feature fusion strategy is proposed that combines semantic and detailed information and introduces multiple convolutional forms. In addition, based on the idea of feedback, a dynamic template branch is designed to fuse temporal features and enhance the representation of the target. The effectiveness of the method is verified on the OTB100 and GOT-10k datasets.
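
The abstract only sketches the two components; the PyTorch sketch below illustrates them under stated assumptions. The names (MultiLevelFusion, update_template), the channel sizes, and the exponential-moving-average template update are illustrative choices, not the paper's exact design, which combines semantic and detailed features through several convolutional forms and feeds tracked-frame features back into the template.

```python
# Hypothetical sketch of the two ideas described in the abstract.
# Names, channel sizes, and the EMA update rule are illustrative
# assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiLevelFusion(nn.Module):
    """Fuse a shallow (detail-rich) and a deep (semantic) feature map
    using several convolutional forms, as the abstract suggests."""

    def __init__(self, shallow_ch: int, deep_ch: int, out_ch: int):
        super().__init__()
        self.proj = nn.Conv2d(shallow_ch, out_ch, kernel_size=1)           # 1x1 projection
        self.conv3 = nn.Conv2d(deep_ch, out_ch, kernel_size=3, padding=1)  # standard 3x3
        self.dw = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1,
                            groups=out_ch)                                  # depthwise 3x3

    def forward(self, shallow: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        # Upsample the coarse deep map to the shallow map's resolution,
        # then combine semantic and detail information additively.
        deep_up = F.interpolate(self.conv3(deep), size=shallow.shape[-2:],
                                mode="bilinear", align_corners=False)
        return self.dw(F.relu(self.proj(shallow) + deep_up))


def update_template(template: torch.Tensor, current: torch.Tensor,
                    momentum: float = 0.9) -> torch.Tensor:
    """Feedback-style dynamic template update: blend the stored template
    with features of the latest tracked target (a simple EMA; the paper's
    update rule may differ)."""
    return momentum * template + (1.0 - momentum) * current


if __name__ == "__main__":
    fusion = MultiLevelFusion(shallow_ch=96, deep_ch=384, out_ch=256)
    shallow = torch.randn(1, 96, 64, 64)   # early-stage backbone features
    deep = torch.randn(1, 384, 16, 16)     # late-stage backbone features
    fused = fusion(shallow, deep)          # -> torch.Size([1, 256, 64, 64])

    template = torch.randn(1, 256, 8, 8)   # stored target template features
    current = torch.randn(1, 256, 8, 8)    # features from the tracked frame
    template = update_template(template, current)
```

The additive fusion and the momentum value are design choices of this sketch; the point is that shallow detail survives the fusion step and the template adapts over time rather than staying fixed to the first frame.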
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12803