Analog Resistive Switching Devices for Training Deep Neural Networks with the Novel Tiki-Taka Algorithm

Bibliographic Details
Published in: Nano Letters, 2024-01, Vol. 24 (3), p. 866–872
Main Authors: Stecconi, Tommaso, Bragaglia, Valeria, Rasch, Malte J., Carta, Fabio, Horst, Folkert, Falcone, Donato F., ten Kate, Sofieke C., Gong, Nanbo, Ando, Takashi, Olziersky, Antonis, Offrein, Bert
Format: Article
Language: English
Description
Summary: A critical bottleneck for the training of large neural networks (NNs) is communication with off-chip memory. A promising mitigation strategy is to integrate crossbar arrays of analogue memories in the Back-End-Of-Line, to store the NN parameters and efficiently perform the required synaptic operations. The “Tiki-Taka” algorithm was developed to facilitate NN training in the presence of device nonidealities. However, a resistive switching device exhibiting all the fundamental Tiki-Taka requirements, namely many programmable states, a centered symmetry point, and low programming noise, had not yet been demonstrated. Here, a complementary metal-oxide-semiconductor (CMOS)-compatible resistive random access memory (RRAM), showing more than 30 programmable states with low noise and a symmetry point with only 5% skew from the center, is presented for the first time. These results enable generalization of Tiki-Taka training from small fully connected networks to larger long-/short-term-memory types of NN.
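The "symmetry point" named in the abstract can be illustrated with a minimal sketch. The soft-bounds device model below is an illustrative assumption, not the paper's measured device behavior: a potentiation pulse shrinks as the conductance approaches its maximum, and a depression pulse shrinks as it approaches its minimum. The symmetry point is the state where a single up pulse and a single down pulse cancel, and its "skew" is its offset from mid-range.

```python
# Illustrative soft-bounds device model (an assumption, not the paper's model).
# Conductance g lies in [0, g_max]; alpha_up / alpha_down set pulse strengths.

def update_sizes(g, g_max=1.0, alpha_up=1.0, alpha_down=1.0):
    """Magnitudes of one potentiation and one depression pulse at state g."""
    dg_up = alpha_up * (1.0 - g / g_max)    # up-step shrinks near the top bound
    dg_down = alpha_down * (g / g_max)      # down-step shrinks near the bottom
    return dg_up, dg_down

def symmetry_point(g_max=1.0, alpha_up=1.0, alpha_down=1.0):
    """State where up and down pulses have equal magnitude.

    Solving alpha_up * (1 - g/g_max) = alpha_down * (g/g_max) for g gives
    g* = g_max * alpha_up / (alpha_up + alpha_down).
    """
    return g_max * alpha_up / (alpha_up + alpha_down)

def skew_from_center(g_max=1.0, alpha_up=1.0, alpha_down=1.0):
    """Offset of the symmetry point from mid-range, as a fraction of the range."""
    g_star = symmetry_point(g_max, alpha_up, alpha_down)
    return abs(g_star - g_max / 2.0) / g_max

# A perfectly symmetric device (alpha_up == alpha_down) has zero skew;
# a ~20% potentiation/depression imbalance already shifts the symmetry
# point by a few percent of the conductance range.
print(skew_from_center(alpha_up=1.0, alpha_down=1.0))   # symmetric device
print(skew_from_center(alpha_up=1.2, alpha_down=1.0))   # asymmetric device
```

In Tiki-Taka-style training, the gradient-accumulating array is operated around this symmetry point so that random up/down pulse sequences average to zero, which is why the abstract highlights a small (5%) skew as a key device metric.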
ISSN: 1530-6984
1530-6992
DOI: 10.1021/acs.nanolett.3c03697