
Long short-term memory networks in memristor crossbars

Bibliographic Details
Published in: arXiv.org, 2018-05
Main Authors: Li, Can; Wang, Zhongrui; Rao, Mingyi; Belkin, Daniel; Song, Wenhao; Jiang, Hao; Peng, Yan; Li, Yunning; Lin, Peng; Hu, Miao; Ge, Ning; Strachan, John Paul; Barnell, Mark; Wu, Qing; Williams, R. Stanley; Yang, J. Joshua; Xia, Qiangfei
Format: Article
Language: English
Description
Summary: Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence. State-of-the-art LSTM models with significantly increased complexity and a large number of parameters, however, face a computing-power bottleneck that results from limited memory capacity and data-communication bandwidth. Here we demonstrate experimentally that an LSTM can be implemented with a memristor crossbar, which has a small circuit footprint for storing a large number of parameters and an in-memory computing capability that circumvents the 'von Neumann bottleneck'. We illustrate the capability of our system by solving real-world regression and classification problems, which shows that the memristor LSTM is a promising low-power, low-latency hardware platform for edge inference.
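
The in-memory computing scheme the summary describes maps each LSTM weight matrix onto a crossbar of memristor conductances, so that applying input voltages to the rows yields the full vector-matrix product as summed column currents in a single analog step. The following is a minimal NumPy sketch of that idea, assuming a common differential-pair encoding for signed weights; the conductance range, function names, and network sizes are illustrative assumptions, not details taken from the paper.

import numpy as np

G_MAX = 100e-6  # assumed maximum device conductance (100 uS); illustrative

def weights_to_conductances(W, g_max=G_MAX):
    """Encode a signed weight matrix as two non-negative conductance
    matrices (G_pos, G_neg); the effective weight is G_pos - G_neg."""
    scale = g_max / np.abs(W).max()
    G_pos = np.where(W > 0, W, 0.0) * scale
    G_neg = np.where(W < 0, -W, 0.0) * scale
    return G_pos, G_neg, scale

def crossbar_vmm(v, G_pos, G_neg, scale):
    """Input voltages v drive the rows; per-device currents follow Ohm's
    law and sum along each column by Kirchhoff's current law, giving the
    column-current vector (G_pos - G_neg)^T v. Rescale to weight units."""
    return (G_pos - G_neg).T @ v / scale

def lstm_step(x, h, c, G):
    """One LSTM time step; all four gate matrices share one crossbar."""
    z = np.concatenate([x, h, [1.0]])   # inputs, recurrent state, bias row
    pre = crossbar_vmm(z, *G)           # one analog read covers all gates
    i_g, f_g, o_g, g_g = np.split(pre, 4)
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    c = sig(f_g) * c + sig(i_g) * np.tanh(g_g)
    h = sig(o_g) * np.tanh(c)
    return h, c

# Usage: a tiny network with 4 inputs and 8 hidden units.
rng = np.random.default_rng(0)
n_in, n_h = 4, 8
W = 0.1 * rng.standard_normal((n_in + n_h + 1, 4 * n_h))
G = weights_to_conductances(W)
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = lstm_step(rng.standard_normal(n_in), h, c, G)

Because the entire multiply-accumulate happens inside the memory array, the stored weights never travel to a separate processor; that locality is what lets the crossbar sidestep the data-movement bottleneck the summary calls the 'von Neumann bottleneck'.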
ISSN:2331-8422
DOI:10.48550/arxiv.1805.11801