Conversion of Siamese networks to spiking neural networks for energy-efficient object tracking
Published in: Neural Computing & Applications, 2022-06, Vol. 34 (12), pp. 9967–9982
Main Authors:
Format: Article
Language: English
Summary: Spiking neural networks (SNNs), the third generation of neural networks, have recently shown remarkable energy efficiency, making them a promising alternative to artificial neural networks (ANNs) with their high energy consumption. SNNs have achieved results competitive with ANNs on relatively simple tasks and small datasets, such as image classification on MNIST and CIFAR, but few studies address more challenging vision tasks on complex datasets. In this paper, we extend deep SNNs to object tracking, a more advanced vision task with embedded applications and energy-saving requirements. We present SiamSNN, a spike-based Siamese network converted from fully convolutional Siamese networks. Specifically, we propose a spiking correlation layer that evaluates the similarity between two spiking feature maps, and we introduce a novel two-status coding scheme that optimizes the temporal distribution of the output spike trains for further improvement. SiamSNN is the first deep SNN tracker to achieve short latency and low precision degradation on the visual object tracking benchmarks OTB-2013, OTB-2015, VOT-2016, VOT-2018, and GOT-10k. Moreover, SiamSNN achieves notably low energy consumption and real-time performance on the neuromorphic chip TrueNorth.
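The spiking correlation layer described in the abstract amounts to a cross-correlation over binary spike maps, accumulated across timesteps: at each timestep the template's spike map slides over the search region's spike map, as in SiamFC. Below is a minimal sketch of that idea in PyTorch; the tensor shapes, the averaging over timesteps, and the function name `spiking_correlation` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def spiking_correlation(z_spikes: torch.Tensor, x_spikes: torch.Tensor) -> torch.Tensor:
    """Accumulate per-timestep cross-correlation between binary spike maps.

    z_spikes: [T, C, Hz, Wz] template (exemplar) spike trains, values in {0, 1}
    x_spikes: [T, C, Hx, Wx] search-region spike trains, values in {0, 1}
    Returns a [1, 1, Hx-Hz+1, Wx-Wz+1] response map averaged over T timesteps.
    """
    T = z_spikes.shape[0]
    response = torch.zeros(1)  # broadcasts against the first conv2d output
    for t in range(T):
        kernel = z_spikes[t].unsqueeze(0)  # [1, C, Hz, Wz]: template acts as the filter
        search = x_spikes[t].unsqueeze(0)  # [1, C, Hx, Wx]
        # "valid" cross-correlation: slide the template over the search map
        response = response + F.conv2d(search, kernel)
    return response / T

# Usage with random sparse spike trains (shapes are illustrative):
T, C = 20, 64
z = (torch.rand(T, C, 6, 6) < 0.1).float()
x = (torch.rand(T, C, 22, 22) < 0.1).float()
print(spiking_correlation(z, x).shape)  # torch.Size([1, 1, 17, 17])
```

Because the inputs are binary, each conv2d here is effectively counting coincident spikes between template and search window; the paper's two-status coding scheme additionally shapes when those spikes occur in the T-step window, which this sketch does not model.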
ISSN: 0941-0643 (print); 1433-3058 (electronic)
DOI: 10.1007/s00521-022-06984-1