NUTS-BSNN: A non-uniform time-step binarized spiking neural network with energy-efficient in-memory computing macro
Published in: Neurocomputing (Amsterdam), 2023-12, Vol. 560, p. 126838, Article 126838
Main Authors: , , , , ,
Format: Article
Language: English
Summary: This work introduces NUTS-BSNN, a Non-uniform Time-step Binarized Spiking Neural Network. NUTS-BSNN is a fully binarized spiking neural network with all-binary weights, including in the input and output layers. In the input and output layers, the weights are represented as stochastic series of numbers, while in the hidden layers they are approximated to binary values, allowing simple XNOR-based computations. To compensate for the information loss due to binarization, the convolutions at the input layer are computed sequentially over multiple time-steps, and the results of these operations are accumulated before spikes are generated for the subsequent layers, which increases overall performance. We chose 14 time-steps for accumulation to achieve a good trade-off between performance and inference latency. The proposed technique was evaluated on three datasets by direct training with a surrogate gradient algorithm, achieving classification accuracies of 93.25%, 88.71%, and 70.31% on Fashion-MNIST, CIFAR-10, and CIFAR-100, respectively. Further, we present an in-memory computing architecture for NUTS-BSNN that limits resource and power consumption in hardware implementations.
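As a rough illustration of the two mechanisms the summary describes (XNOR-based dot products over binary weights, and membrane accumulation across multiple time-steps before spiking), below is a minimal NumPy sketch. The layer sizes, threshold, scaling, and the lif_fire helper are illustrative assumptions for this sketch, not the paper's actual design.

    import numpy as np

    def xnor_dot(x_bits, w_bits):
        # Dot product of two {-1, +1} vectors stored as {0, 1} bits
        # (+1 -> 1, -1 -> 0): equals 2 * popcount(XNOR(x, w)) - n.
        n = x_bits.size
        agree = (~(x_bits ^ w_bits)) & 1  # 1 where the bits match
        return 2 * int(agree.sum()) - n

    def lif_fire(membrane, threshold):
        # Emit a spike where the accumulated potential crosses the
        # threshold, then hard-reset those neurons (reset rule assumed).
        spikes = (membrane >= threshold).astype(np.int8)
        membrane = np.where(spikes == 1, 0.0, membrane)
        return spikes, membrane

    rng = np.random.default_rng(0)
    T, n_in, n_out = 14, 64, 8           # 14 time-steps, as in the abstract
    w_bits = rng.integers(0, 2, size=(n_out, n_in), dtype=np.int8)

    membrane = np.zeros(n_out)
    for t in range(T):
        # Stand-in binary input spikes for one time-step.
        x_bits = rng.integers(0, 2, size=n_in, dtype=np.int8)
        # XNOR-based pre-activations, accumulated over time-steps.
        psp = np.array([xnor_dot(x_bits, w) for w in w_bits], dtype=float)
        membrane += psp / n_in           # scaling assumption for stability
        spikes, membrane = lif_fire(membrane, threshold=0.5)

Accumulating the pre-activations in the membrane over several time-steps before thresholding is what lets the network recover some of the precision lost to binarization, at the cost of inference latency, which is the trade-off behind the paper's choice of 14 time-steps.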
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2023.126838