Effective Active Learning Method for Spiking Neural Networks

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-09, Vol. 35 (9), p. 12373-12382
Main Authors: Xie, Xiurui, Yu, Bei, Liu, Guisong, Zhan, Qiugang, Tang, Huajin
Format: Article
Language: English
Description
Summary: A large quantity of labeled data is required to train high-performance deep spiking neural networks (SNNs), but obtaining labeled data is expensive. Active learning has been proposed to reduce the quantity of labeled data required by deep learning models. However, conventional active learning methods are not as effective in SNNs as they are in conventional artificial neural networks (ANNs) because of differences in feature representation and information transmission. To address this issue, we propose an effective active learning method for deep SNN models in this article. Specifically, a loss prediction module, ActiveLossNet, is proposed to extract features and select valuable samples for deep SNNs. Then, we derive the corresponding active learning algorithm for deep SNN models. Comprehensive experiments are conducted on CIFAR-10, MNIST, Fashion-MNIST, and SVHN with different SNN frameworks, including the seven-layer CIFARNet and the 20-layer ResNet-18. The comparison results demonstrate that the proposed active learning algorithm outperforms random selection and conventional ANN active learning methods. In addition, our method converges faster than conventional active learning methods.
ISSN: 2162-237X
2162-2388
DOI: 10.1109/TNNLS.2023.3257333
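
The summary describes a loss-prediction-based selection strategy: a module (ActiveLossNet) scores unlabeled samples by their predicted loss, and the highest-scoring samples are sent for labeling. Below is a minimal sketch of that selection step, assuming a PyTorch-style feature extractor standing in for the deep SNN; the names LossPredictionHead and select_samples, the data-loader format, and the network sizes are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn


class LossPredictionHead(nn.Module):
    # Illustrative stand-in for a loss-prediction module: it maps a
    # feature vector to a single scalar, the predicted training loss.
    def __init__(self, feature_dim, hidden_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, features):
        return self.net(features).squeeze(-1)  # shape: (batch,)


@torch.no_grad()
def select_samples(feature_extractor, loss_head, unlabeled_loader, budget):
    # Score every unlabeled sample by its predicted loss and return the
    # dataset indices of the `budget` highest-scoring samples, i.e. the
    # samples the current model is expected to handle worst.
    scores, indices = [], []
    for idx, x in unlabeled_loader:   # assumed to yield (index, input) pairs
        feats = feature_extractor(x)  # e.g. SNN features averaged over time steps
        scores.append(loss_head(feats))
        indices.append(idx)
    scores = torch.cat(scores)
    indices = torch.cat(indices)
    top = torch.topk(scores, k=min(budget, scores.numel())).indices
    return indices[top].tolist()

In the paper's setting the feature extractor would be the deep SNN itself and the prediction head would be trained jointly with a loss-prediction objective; the sketch only illustrates how predicted-loss scores drive the sample-selection step described in the summary.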