SpikingSSMs: Learning Long Sequences with Sparse and Parallel Spiking State Space Models
Known for their low energy consumption, spiking neural networks (SNNs) have attracted considerable attention over the past decades. While SNNs are increasingly competitive with artificial neural networks (ANNs) on vision tasks, they are rarely used for long-sequence tasks despite their intrinsic temporal dynamics…
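The abstract's central engineering point is that the spike reset makes neuronal dynamics inherently serial. The following is a minimal sketch of the iterative baseline the abstract contrasts against, assuming standard leaky integrate-and-fire (LIF) dynamics with a hard reset applied to SSM pre-activations; all names, shapes, and the specific reset rule are illustrative assumptions, not the paper's exact formulation:

```python
import torch

def lif_readout_iterative(u, beta=0.9, v_th=1.0):
    """Iterative LIF spiking readout over SSM pre-activations u: [T, B, D].

    The hard reset makes step t depend on the spike at step t-1, so the
    loop cannot be parallelized over time -- the serial bottleneck that
    the paper's surrogate dynamic network is designed to remove.
    """
    v = torch.zeros_like(u[0])        # membrane potential, [B, D]
    spikes = []
    for u_t in u:
        v = beta * v + u_t            # leaky integration of the SSM output
        s = (v >= v_th).float()       # threshold crossing emits a spike
        v = v * (1.0 - s)             # hard reset where a spike fired
        spikes.append(s)
    return torch.stack(spikes)        # sparse {0, 1} activations, [T, B, D]
```

The {0, 1} outputs are what make downstream synaptic computation sparse: wherever a spike is 0, the corresponding multiply-accumulate can be skipped entirely on event-driven hardware.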
Published in: | arXiv.org, 2024-08 |
---|---|
Main Authors: | Shen, Shuaijie; Wang, Chao; Huang, Renzhuo; Zhong, Yan; Guo, Qinghai; Lu, Zhichao; Zhang, Jianguo; Leng, Luziwei |
Format: | Article |
Language: | English |
Subjects: | Acceleration; Artificial neural networks; Energy consumption; Iterative methods; Large language models; Neural networks; Spiking; State space models |
Online Access: | Get full text |
container_title | arXiv.org |
---|---|
creator | Shen, Shuaijie; Wang, Chao; Huang, Renzhuo; Zhong, Yan; Guo, Qinghai; Lu, Zhichao; Zhang, Jianguo; Leng, Luziwei |
description | Known for their low energy consumption, spiking neural networks (SNNs) have attracted considerable attention over the past decades. While SNNs are increasingly competitive with artificial neural networks (ANNs) on vision tasks, they are rarely used for long-sequence tasks despite their intrinsic temporal dynamics. In this work, we develop spiking state space models (SpikingSSMs) for long-sequence learning by leveraging the sequence-learning abilities of state space models (SSMs). Inspired by dendritic neuron structure, we hierarchically integrate neuronal dynamics with the original SSM block while realizing sparse synaptic computation. Furthermore, to resolve the conflict between event-driven neuronal dynamics and parallel computing, we propose a lightweight surrogate dynamic network that accurately predicts the after-reset membrane potential and is compatible with learnable thresholds, enabling orders-of-magnitude acceleration in training speed compared with conventional iterative methods. On the Long Range Arena benchmark, SpikingSSM achieves performance competitive with state-of-the-art SSMs while realizing 90% network sparsity on average. On language modeling, our network significantly surpasses existing spiking large language models (spikingLLMs) on the WikiText-103 dataset with only a third of the model size, demonstrating its potential as a backbone architecture for low-computation-cost LLMs. (A hedged code sketch of this parallel readout idea follows the record fields below.) |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-08 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_3097950914 |
source | ProQuest Publicly Available Content database |
subjects | Acceleration; Artificial neural networks; Energy consumption; Iterative methods; Large language models; Neural networks; Spiking; State space models |
title | SpikingSSMs: Learning Long Sequences with Sparse and Parallel Spiking State Space Models |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-25T15%3A00%3A55IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=SpikingSSMs:%20Learning%20Long%20Sequences%20with%20Sparse%20and%20Parallel%20Spiking%20State%20Space%20Models&rft.jtitle=arXiv.org&rft.au=Shen,%20Shuaijie&rft.date=2024-08-27&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E3097950914%3C/proquest%3E%3Cgrp_id%3Ecdi_FETCH-proquest_journals_30979509143%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=3097950914&rft_id=info:pmid/&rfr_iscdi=true |
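For contrast with the iterative loop sketched earlier, here is a hedged sketch of the parallelization idea described in the description field: compute reset-free membrane potentials for the whole sequence at once (a linear recurrence, so it admits a parallel form), let a small learned network predict the after-reset potentials, and threshold the entire sequence in one shot. The `SurrogateDynamicNetwork` architecture, the cumulative-sum trick, and all parameter names are assumptions for illustration; the paper's actual surrogate dynamic network is not specified in this record.

```python
import torch
import torch.nn as nn

class SurrogateDynamicNetwork(nn.Module):
    """Hypothetical stand-in for the paper's surrogate dynamic network:
    maps reset-free potentials to predicted after-reset potentials."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim)
        )

    def forward(self, v_free):
        return self.net(v_free)

def lif_readout_parallel(u, sdn, beta=0.9, v_th=1.0):
    """One-shot spiking readout with no loop over time; u: [T, B, D]."""
    T = u.shape[0]
    t = torch.arange(T, dtype=u.dtype, device=u.device).view(T, 1, 1)
    # Reset-free recurrence v[t] = beta * v[t-1] + u[t] unrolled as a
    # scaled cumulative sum (fine for a sketch; beta**(-t) can overflow
    # for long sequences, where a proper parallel scan would be used).
    v_free = (beta ** t) * torch.cumsum(u * (beta ** (-t)), dim=0)
    v_hat = sdn(v_free)               # learned correction for the resets
    return (v_hat >= v_th).float()    # threshold the whole sequence at once

# Usage: random pre-activations stand in for real SSM outputs.
T, B, D = 16, 2, 8
spikes = lif_readout_parallel(torch.randn(T, B, D), SurrogateDynamicNetwork(D))
```

Presumably, training fits the surrogate to the after-reset potentials produced by the exact iterative dynamics; once it is accurate, the readout reduces to a few parallel tensor operations, which is the claimed source of the training-speed acceleration.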