Hybrid generative–discriminative hash tracking with spatio-temporal contextual cues

Bibliographic Details
Published in: Neural Computing & Applications, 2018, Vol. 29 (2), pp. 389-399
Main Authors: Dai, Manna; Cheng, Shuying; He, Xiangjian
Format: Article
Language: English
Description
Summary: Visual object tracking is of great application value in video monitoring systems. Recent work on video tracking has taken into account the spatial relationship between the target object and its background. In this paper, that spatial relationship is combined with the temporal relationship between features on different video frames, yielding a real-time tracker based on a hash algorithm with spatio-temporal cues. Unlike most existing work on video tracking, which treats tracking as image matching or image classification alone, we propose a hierarchical framework that conducts both matching and classification tasks to form a coarse-to-fine tracking system. We develop a generative model, under a modified particle filter with hash fingerprints, for coarse matching by maximum a posteriori estimation, and a discriminative model for fine classification by maximizing a confidence map derived from a context model. The confidence map reveals the spatio-temporal dynamics of the target. Because a hash fingerprint is merely a binary vector and the modified particle filter uses only a small number of particles, our tracker has a low computation cost. Experiments on eight challenging video sequences from a public benchmark demonstrate that our tracker outperforms eight state-of-the-art trackers in both accuracy and speed.
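
As a concrete sketch of the coarse matching stage the abstract describes, the Python example below implements one step of a hash-based particle filter: candidate patches are proposed around the previous state, each is reduced to a binary fingerprint, and the maximum a posteriori particle is the one whose fingerprint has the highest Hamming similarity to the template. This is an illustration under stated assumptions, not the authors' implementation: the average-hash fingerprint, the Gaussian motion model, the fixed 32-pixel box, and the names hash_fingerprint and track_frame are hypothetical stand-ins, since this record does not specify the paper's hash function or filter details.

    import numpy as np

    def hash_fingerprint(patch, size=8):
        # Binary fingerprint via average hashing (a stand-in; the paper's
        # exact hash function is not given in this record).
        h, w = patch.shape
        ph, pw = h // size, w // size
        # Block-average the patch down to a size x size grid.
        small = patch[:ph * size, :pw * size].reshape(size, ph, size, pw).mean(axis=(1, 3))
        # One bit per cell: is the cell brighter than the grid mean?
        return (small > small.mean()).astype(np.uint8).ravel()

    def track_frame(frame, prev_state, template_fp, n_particles=100,
                    spread=8.0, box=32, rng=None):
        # One coarse-matching step: propagate particles with a Gaussian
        # motion model, score each candidate patch by Hamming similarity
        # of fingerprints, and return the maximum a posteriori particle.
        if rng is None:
            rng = np.random.default_rng()
        H, W = frame.shape
        particles = prev_state + rng.normal(0.0, spread, size=(n_particles, 2))
        particles[:, 0] = np.clip(particles[:, 0], 0, H - box)
        particles[:, 1] = np.clip(particles[:, 1], 0, W - box)
        best_score, best_state = -1.0, prev_state
        for y, x in particles:
            patch = frame[int(y):int(y) + box, int(x):int(x) + box]
            fp = hash_fingerprint(patch)
            # Fraction of matching bits serves as the observation likelihood.
            score = 1.0 - np.mean(fp != template_fp)
            if score > best_score:
                best_score, best_state = score, np.array([y, x])
        return best_state, best_score

    # Toy usage: re-locate a patch cut from a synthetic frame.
    rng = np.random.default_rng(0)
    frame = rng.random((240, 320))
    template_fp = hash_fingerprint(frame[100:132, 150:182])
    state, score = track_frame(frame, np.array([98.0, 148.0]), template_fp, rng=rng)
    print(state, score)

Because each fingerprint here is only 64 bits and the filter uses few particles, the per-frame cost is tiny, which mirrors the efficiency argument the abstract makes. The fine, discriminative stage (maximizing a spatio-temporal confidence map built from a context model) is omitted, since the record gives no further detail about it.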
ISSN: 0941-0643
eISSN: 1433-3058
DOI: 10.1007/s00521-016-2452-z