
Efficient Convolution Neural Networks for Object Tracking Using Separable Convolution and Filter Pruning

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 106466-106474
Main Authors: Mao, Yuanhong, He, Zhanzhuang, Ma, Zhong, Tang, Xuehan, Wang, Zhuping
Format: Article
Language:English
Description
Summary: Object tracking based on deep learning is an active research topic in computer vision with many applications. Due to high computation and memory costs, it is difficult to deploy convolutional neural networks (CNNs) for object tracking on embedded systems with limited hardware resources. This paper uses a Siamese network as the backbone of the tracker. The convolution layers used to extract features typically account for most of the cost, so improvements should focus on them to make tracking more efficient. In this paper, the standard convolution is replaced by a separable convolution, which consists of a depthwise convolution followed by a pointwise convolution. To further reduce computation, filters in the depthwise convolution layers are pruned according to their weight variance. Because weight distributions differ across convolution layers, the pruning is guided by a designed hyper-parameter. With these improvements, the number of parameters is reduced to 13% of the original network and the computation to 23%. On the NVIDIA Jetson TX2, the tracking speed increases by a factor of 3.65 on the CPU and 2.08 on the GPU, without significant degradation of tracking performance on the VOT benchmark.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2932733
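
The summary above rests on two optimizations: factoring each standard convolution into a depthwise and a pointwise convolution, and pruning depthwise filters whose weights have low variance, with a per-layer hyper-parameter controlling how aggressively each layer is pruned. Below is a minimal sketch of both ideas, assuming PyTorch; the layer sizes, the keep_ratio parameter, and the exact ranking rule are illustrative stand-ins, not the authors' published implementation.

    import torch
    import torch.nn as nn


    class SeparableConv2d(nn.Module):
        """A standard convolution factored into depthwise + pointwise steps."""

        def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1):
            super().__init__()
            # Depthwise: one spatial filter per input channel (groups=in_ch).
            self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size, stride,
                                       padding, groups=in_ch, bias=False)
            # Pointwise: 1x1 convolution that mixes information across channels.
            self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)

        def forward(self, x):
            return self.pointwise(self.depthwise(x))


    def depthwise_filters_to_keep(depthwise: nn.Conv2d, keep_ratio: float):
        """Rank depthwise filters by weight variance and keep the top fraction.

        keep_ratio stands in for the per-layer hyper-parameter that adapts
        pruning to each layer's weight distribution (assumption: the paper's
        exact selection rule may differ).
        """
        weights = depthwise.weight.data                      # shape (C, 1, k, k)
        variances = weights.view(weights.size(0), -1).var(dim=1)
        n_keep = max(1, int(round(keep_ratio * weights.size(0))))
        keep_idx = torch.argsort(variances, descending=True)[:n_keep]
        return torch.sort(keep_idx).values


    # Usage: swap in a 3x3 separable block, then decide which depthwise
    # filters survive pruning for this layer.
    layer = SeparableConv2d(in_ch=64, out_ch=128)
    kept = depthwise_filters_to_keep(layer.depthwise, keep_ratio=0.75)
    print(f"keeping {kept.numel()} of {layer.depthwise.out_channels} depthwise filters")

For a k x k convolution with C_in input and C_out output channels, this factorization reduces the multiply-accumulates per output position from k*k*C_in*C_out to k*k*C_in + C_in*C_out; together with variance-based pruning of the depthwise filters, this is the source of the parameter and computation reductions reported in the abstract.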