Enabling Low-Power Charge-Domain Nonvolatile Computing-in-Memory (CIM) With Ferroelectric Memcapacitor
Published in: IEEE Transactions on Electron Devices, 2024-04, Vol. 71, No. 4, pp. 1-7
Main Authors:
Format: Article
Language: English
Subjects:
Summary: The explosive growth in data-centric computing driven by artificial intelligence (AI) and big data has surpassed the capabilities of the traditional von Neumann architecture. The architectural separation of computing and memory units results in substantial energy consumption, latency, and additional hardware expense. A novel computing paradigm with low power consumption and high parallelism is therefore urgently needed. One promising solution to the memory-wall challenge is capacitor-based computing-in-memory (CIM). In this article, we propose a hafnium-based ferroelectric memcapacitor (FE-memcap) that achieves a 10³ ON/OFF ratio by combining ferroelectric polarization switching with p-i-n junction charge shielding. Furthermore, gate work-function engineering enables high ON/OFF ratios at small drive voltages, typically around 0.05 V. When integrated into an XNOR binary neural network (BNN), the FE-memcap demonstrates computational accuracy equivalent to that of traditional memristor-based counterparts. Importantly, the FE-memcap exhibits negligible static power consumption and reduces dynamic power consumption by more than 10⁴ times. These attributes make it highly suitable for event-driven edge computing and strengthen its potential for future AI applications.
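The XNOR-BNN mentioned in the summary replaces multiply-accumulate operations with bitwise XNOR and popcount over {-1, +1}-encoded weights and activations; the memcapacitor array evaluates this in the charge domain. A minimal software sketch of the generic XNOR-net arithmetic (an illustration of the standard technique, not the authors' FE-memcap circuit or code) might look like:

```python
import numpy as np

def binarize(x):
    """Map real values to {-1, +1} via sign (0 maps to +1)."""
    return np.where(x >= 0, 1, -1)

def xnor_dot(w_bin, a_bin):
    """Binary dot product via XNOR + popcount.
    With {-1, +1} encoding, XNOR(w, a) agrees with w * a,
    so the multiply-accumulate reduces to counting matches."""
    matches = int(np.sum(w_bin == a_bin))  # popcount of the XNOR result
    n = w_bin.size
    return 2 * matches - n                 # equals sum(w_bin * a_bin)

w = binarize(np.array([0.3, -1.2, 0.7, -0.1]))
a = binarize(np.array([-0.5, -0.8, 0.9, 0.2]))
print(xnor_dot(w, a))  # matches np.dot(w, a) on the binarized vectors
```

The `2 * matches - n` identity is what lets binary hardware (here, charge summation on a capacitor line) stand in for a full multiplier array.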
ISSN: 0018-9383, 1557-9646
DOI: 10.1109/TED.2024.3367965