Improved SwinTrack single target tracking algorithm based on spatio‐temporal feature fusion
| Published in: | IET Image Processing 2023-06, Vol. 17 (8), p. 2410-2421 |
|---|---|
| Main Authors: | , , , |
| Format: | Article |
| Language: | English |
| Summary: | Single target tracking based on computer vision helps to collect, analyse and exploit target information. The SwinTrack algorithm has received widespread attention as one of the Siamese (twin-network) trackers with the best trade-off between tracking accuracy and speed, but it suffers from insufficient fusion of deep and shallow features, which loses shallow detail information, and from insufficient use of temporal information, which causes inconsistency between the target and the template. To fuse features effectively in space, semantic and detailed information are combined and multiple convolutional forms are introduced in a multi-level feature fusion strategy. In addition, based on the idea of feedback, a dynamic template branch is designed to fuse temporal features and enhance the representation of target features. The effectiveness of this method was verified on the OTB100 and GOT-10k datasets. |
| ISSN: | 1751-9659, 1751-9667 |
| DOI: | 10.1049/ipr2.12803 |
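
The summary describes two ideas: fusing deep (semantic) and shallow (detail) features in space, and updating the template over time through feedback. As a rough illustration of the first idea, below is a minimal PyTorch sketch of a multi-level fusion block; the module name `MultiLevelFusion`, the channel sizes, and the particular convolutional forms (1x1 projections, a standard 3x3, and a dilated 3x3) are assumptions for this example, not the paper's implementation.

```python
# Illustrative sketch only: a generic multi-level spatial feature fusion block.
# Module names, channel sizes, and the choice of convolutions are assumptions
# for illustration and are not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiLevelFusion(nn.Module):
    """Fuse a shallow (detail-rich) feature map with a deep (semantic) one."""

    def __init__(self, shallow_ch: int, deep_ch: int, out_ch: int):
        super().__init__()
        # 1x1 convolutions align the channel dimensions of the two levels.
        self.shallow_proj = nn.Conv2d(shallow_ch, out_ch, kernel_size=1)
        self.deep_proj = nn.Conv2d(deep_ch, out_ch, kernel_size=1)
        # Two convolutional forms refine the fused map: a standard 3x3 and a
        # dilated 3x3 that enlarges the receptive field without extra stride.
        self.refine_local = nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1)
        self.refine_context = nn.Conv2d(out_ch, out_ch, kernel_size=3,
                                        padding=2, dilation=2)
        self.out = nn.Conv2d(2 * out_ch, out_ch, kernel_size=1)

    def forward(self, shallow: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        # Upsample the deep map to the shallow map's spatial size, then add.
        deep_up = F.interpolate(self.deep_proj(deep), size=shallow.shape[-2:],
                                mode="bilinear", align_corners=False)
        fused = self.shallow_proj(shallow) + deep_up
        # Concatenate the two refined branches and project back to out_ch.
        refined = torch.cat([F.relu(self.refine_local(fused)),
                             F.relu(self.refine_context(fused))], dim=1)
        return self.out(refined)


if __name__ == "__main__":
    # Toy shapes: a 40x40 shallow map and a 20x20 deep map from a backbone.
    shallow = torch.randn(1, 96, 40, 40)
    deep = torch.randn(1, 384, 20, 20)
    print(MultiLevelFusion(96, 384, 256)(shallow, deep).shape)  # (1, 256, 40, 40)
```

For the second idea, a feedback-style dynamic template branch could be sketched as a confidence-gated template update; the threshold and moving-average rule below are illustrative assumptions, not the paper's exact scheme.

```python
# Illustrative sketch only: a feedback-style dynamic template update.
# The confidence threshold and exponential moving average are assumptions.
import torch


def update_template(template: torch.Tensor, candidate: torch.Tensor,
                    confidence: float, threshold: float = 0.7,
                    momentum: float = 0.9) -> torch.Tensor:
    """Blend the current template with a high-confidence tracked region."""
    if confidence < threshold:
        # Low-confidence frames are ignored so tracking drift is not fed back.
        return template
    return momentum * template + (1.0 - momentum) * candidate
```

Gating the update on tracker confidence is the usual motivation for feedback-based template schemes: it keeps low-quality frames from contaminating the template while still letting it adapt to appearance change.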