Neuromorphic Computing Using Random Synaptic Feedback Weights for Error Backpropagation in NAND Flash Memory-Based Synaptic Devices
Published in: IEEE Transactions on Electron Devices, 2023-03, Vol. 70 (3), p. 1-6
Main Authors:
Format: Article
Language: English
Summary: This work proposes utilizing a separate synaptic string array for error backpropagation in a NAND flash memory-based synaptic architecture with random synaptic feedback weights. To enable error backpropagation, forward and backward propagations are processed in separate synaptic devices in the forward and backward synaptic arrays, respectively. In addition, the synaptic weights in the forward synaptic array are updated at each iteration, while those in the backward synaptic array are kept fixed to reduce the burden on the peripheral circuits and the power consumption. The optimal conductance response is investigated considering the linearity of the conductance response and the ratio of the maximum to the minimum current. Reliability characteristics are verified by retention, endurance, and pass-bias disturbance measurements. Hardware-based neural networks with random synaptic feedback weights achieve an inference accuracy of 95.41%, comparable to the 95.58% obtained with transposed weights. Hardware-based neural network simulations demonstrate that the inference accuracy of the proposed on-chip learning scheme hardly decreases compared with that of off-chip learning, even as device variation increases.
ISSN: 0018-9383; 1557-9646
DOI: 10.1109/TED.2023.3237670
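
The training scheme described in the summary, where the error is propagated backward through fixed random weights instead of the transpose of the forward weights, corresponds to the feedback-alignment rule. Below is a minimal NumPy sketch of that rule, not the authors' implementation: the layer sizes, learning rate, activation, and synthetic data are illustrative assumptions. The forward matrices `W1`/`W2` play the role of the forward synaptic array and are updated every iteration, while the fixed random matrix `B` plays the role of the backward synaptic array.

```python
# Minimal sketch of training with fixed random feedback weights
# (feedback alignment). Sizes and data are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 784, 100, 10          # assumed layer sizes

# Forward weights: updated at each iteration (forward synaptic array)
W1 = rng.normal(0, 0.1, (n_in, n_hid))
W2 = rng.normal(0, 0.1, (n_hid, n_out))

# Fixed random feedback weights: never updated (backward synaptic array)
B = rng.normal(0, 0.1, (n_out, n_hid))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, y, lr=0.05):
    """One iteration: forward pass, error propagation through B, weight update."""
    global W1, W2
    # Forward propagation
    h = sigmoid(x @ W1)
    out = sigmoid(h @ W2)
    # Output error (squared-error loss with sigmoid derivative)
    e_out = (out - y) * out * (1.0 - out)
    # Backward propagation through the FIXED random matrix B, not W2.T
    e_hid = (e_out @ B) * h * (1.0 - h)
    # Only the forward weights are updated
    W2 -= lr * np.outer(h, e_out)
    W1 -= lr * np.outer(x, e_hid)
    return out

# Toy usage with a single synthetic sample (placeholder for a real dataset)
x = rng.random(n_in)
y = np.eye(n_out)[3]                        # one-hot target for class 3
for _ in range(100):
    out = train_step(x, y)
print("predicted class:", out.argmax())
```

Replacing `B` with `W2.T` in `train_step` recovers standard backpropagation, which corresponds to the transposed-weight baseline quoted in the summary; keeping `B` fixed is what removes the need to read out and transpose the forward array during the backward pass.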