Low-energy-consumption organic synaptic transistors with high recognition accuracy enabled by Schottky barrier regulation
Published in: Science China Materials 2023-11, Vol. 66 (11), p. 4453-4463
Main Authors:
Format: Article
Language: English
Summary: To build neuromorphic computing networks comparable to the human brain, individual artificial synaptic devices should exhibit energy consumption down to the femtojoule level. However, most existing approaches to low-energy synaptic devices based on an Ohmic contact are structurally complex or require specific materials, which hinders the further development of artificial neural networks. In this study, a Schottky-barrier-regulated organic synaptic transistor (SBROST) is reported. Device performance is improved by introducing a Schottky barrier at the contact interface between the source electrode and the semiconductor, which considerably reduces the energy consumed per synaptic event compared with conventional organic synaptic transistors (OSTs) with an Ohmic contact. The SBROST not only lowers the device's operating voltage and current but also has a simple structure that can be applied to different organic synaptic devices. Furthermore, the SBROST achieves high recognition accuracy at low energy consumption: after 100 epochs, an SBROST-based artificial neural network for handwritten-digit recognition reaches a recognition accuracy of 93.53%, close to the ideal accuracy of 95.62%. Introducing a Schottky barrier into synaptic transistors offers a new perspective for constructing brain-like neural computing networks.
ISSN: 2095-8226, 2199-4501
DOI: 10.1007/s40843-023-2573-6
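The "ideal" versus device-based accuracies quoted in the summary refer to simulated handwritten-digit recognition with and without synaptic-device non-idealities. As a rough illustration of that kind of comparison (not the authors' simulation), the sketch below trains a single-layer classifier twice: once with unconstrained floating-point weights and once with weights snapped to a finite number of conductance states after each update. The dataset, network size, learning rate, and the 64-state assumption are placeholders, not values from the paper.

```python
# Minimal sketch: "ideal" (floating-point) vs. device-constrained training of a
# single-layer softmax classifier on small handwritten digits. All parameters
# here are illustrative assumptions, not taken from the SBROST paper.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)            # 8x8 handwritten digit images
X = X / 16.0                                   # normalize pixel values to [0, 1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(num_states=None, epochs=100, lr=0.5):
    """Train a softmax classifier; if num_states is given, weights are snapped
    to that many evenly spaced "conductance" levels after every update."""
    W = rng.normal(0.0, 0.01, (X_tr.shape[1], 10))
    b = np.zeros(10)
    onehot = np.eye(10)[y_tr]
    for _ in range(epochs):
        p = softmax(X_tr @ W + b)
        W -= lr * (X_tr.T @ (p - onehot)) / len(X_tr)
        b -= lr * (p - onehot).mean(axis=0)
        if num_states is not None:
            # Emulate a device with a bounded, discrete conductance range
            W = np.clip(W, -1.0, 1.0)
            W = np.round(W * (num_states / 2)) / (num_states / 2)
    pred = (X_te @ W + b).argmax(axis=1)
    return (pred == y_te).mean()

print(f"ideal (float) weights:   {train():.4f}")
print(f"64-state device weights: {train(num_states=64):.4f}")
```

The gap between the two printed accuracies mimics the kind of ideal-versus-device comparison reported in the abstract; a richer device model would also include nonlinear and asymmetric conductance updates.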