
Learning Local Structured Correlation Filters for Visual Tracking via Spatial Joint Regularization

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 39158-39171
Main Authors: Guo, Chenggang; Chen, Dongyi; Huang, Zhiqi
Format: Article
Language: English
Description
Summary: Robust visual tracking is a fundamental problem in computer vision with a wide range of practical applications. Recent progress in robust tracking methods has mainly been built upon discriminative correlation filters (DCF). However, most DCF-based methods develop their trackers under the assumption of a holistic appearance model, ignoring the underlying spatial local structural information. In this paper, we introduce tree-structured group sparsity regularization into the DCF formulation. The correlation filter to be learned is divided into hierarchical local groups. The relationship between the response and the circularly shifted target appearance is regularized by applying the ℓ1-norm across the ℓ2-norms of the hierarchical local filter groups. Moreover, a local response consistency term is incorporated together with the structured sparsity to make each local filter group contribute equally to the final response. The accelerated proximal gradient method is employed to optimize this non-smooth composite regularization problem. Benefiting from the properties of circulant matrices, several key steps in the optimization can be solved efficiently in the frequency domain. Experiments are conducted on four publicly available visual tracking benchmarks. Both quantitative and qualitative evaluations demonstrate that the proposed tracker performs favorably against a number of state-of-the-art tracking methods.
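
The computational recipe summarized above (group soft-thresholding over hierarchical filter groups inside an accelerated proximal gradient loop, with the circulant data term evaluated by FFTs) can be sketched briefly. The following 1-D NumPy sketch is not the authors' implementation: the dyadic group layout, the regularization weight lam, and all function names are illustrative assumptions, and the paper's local response consistency term is omitted for brevity.

```python
# Minimal 1-D sketch of an l1-over-l2 (tree-structured group sparsity)
# proximal step inside a FISTA loop, with the circular-correlation data
# term and its adjoint computed in the frequency domain via the FFT.
import numpy as np

def make_tree_groups(n, levels=3):
    """Hierarchical dyadic index groups, leaves first and root last.
    Assumes n is divisible by 2**(levels-1); the layout is illustrative."""
    groups = []
    for lvl in range(levels - 1, -1, -1):
        parts = 2 ** lvl
        size = n // parts
        for p in range(parts):
            groups.append(np.arange(p * size, (p + 1) * size))
    return groups

def prox_tree_group(w, groups, thresh):
    """Sequential group soft-thresholding, children before parents."""
    w = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > 0:
            w[g] *= max(0.0, 1.0 - thresh / norm)
    return w

def learn_filter(x, y, lam=0.05, iters=100, levels=3):
    """FISTA for 0.5*||corr(x, w) - y||^2 + lam * sum_g ||w_g||_2."""
    n = len(x)
    xf = np.fft.fft(x)
    groups = make_tree_groups(n, levels)
    L = np.max(np.abs(xf) ** 2)   # Lipschitz constant of the smooth term
    w = z = np.zeros(n)
    t = 1.0
    for _ in range(iters):
        # Circular correlation and its adjoint, both via the FFT identity.
        resp = np.fft.ifft(np.conj(xf) * np.fft.fft(z)).real
        grad = np.fft.ifft(xf * np.fft.fft(resp - y)).real
        w_new = prox_tree_group(z - grad / L, groups, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = w_new + ((t - 1.0) / t_new) * (w_new - w)   # Nesterov momentum
        w, t = w_new, t_new
    return w

# Toy usage: a Gaussian-shaped desired response centered on the target.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
y = np.exp(-0.5 * ((np.arange(64) - 32) / 2.0) ** 2)
w = learn_filter(x, y)
```

Applying group soft-thresholding sequentially from leaves to root computes the exact proximal operator for nested (tree-structured) group norms, which is why the groups are generated finest-level first.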
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2906508