22.1 A 1.1V 16GB 640GB/s HBM2E DRAM with a Data-Bus Window-Extension Technique and a Synergetic On-Die ECC Scheme


Bibliographic Details
Main Authors: Oh, Chi-Sung, Chun, Ki Chul, Byun, Young-Yong, Kim, Yong-Ki, Kim, So-Young, Ryu, Yesin, Park, Jaewon, Kim, Sinho, Cha, Sanguhn, Shin, Donghak, Lee, Jungyu, Son, Jong-Pil, Ho, Byung-Kyu, Cho, Seong-Jin, Kil, Beomyong, Ahn, Sungoh, Lim, Baekmin, Park, Yongsik, Lee, Kijun, Lee, Myung-Kyu, Baek, Seungduk, Noh, Junyong, Lee, Jae-Wook, Lee, Seungseob, Kim, Sooyoung, Lim, Botak, Choi, Seouk-Kyu, Kim, Jin-Guk, Choi, Hye-In, Kwon, Hyuk-Jun, Kong, Jun Jin, Sohn, Kyomin, Kim, Nam Sung, Park, Kwang-Il, Lee, Jung-Bae
Format: Conference Proceeding
Language: English
Description
Summary: Rapidly evolving artificial intelligence (AI) technology, such as deep learning, has been successfully deployed in various applications, such as image recognition, health care, and autonomous driving. This rapid evolution and successful deployment of AI technology have been possible owing to the emergence of accelerators, such as GPUs and TPUs, that have a higher data throughput. This, in turn, requires an enhanced memory system with large capacity and high bandwidth [1]. HBM has been the most preferred high-bandwidth memory technology due to its high-speed and low-power characteristics and its 1024 IOs, facilitated by 2.5D silicon-interposer technology, as well as the large capacity realized by through-silicon-via (TSV) stacking [2]. Previous-generation HBM2 supports 8GB capacity with a stack of 8 DRAM dies (i.e., an 8-high stack) and 341GB/s (2.7Gb/s/pin) bandwidth [3]. The HBM industry trend has been a speed improvement of 15-to-20% every year, while capacity increases by 1.5-to-2x every two years. In this paper, we present a 16GB HBM2E with circuit and design techniques to increase its bandwidth up to 640GB/s (5Gb/s/pin) while providing stable bit-cell operation in a 2nd-generation 10nm DRAM process, featuring: (1) a data-bus window-extension technique to cope with reduced t_{CCD}; (2) a power-delivery network (PDN) designed for stable operation at high speed; (3) a synergetic on-die ECC scheme to reliably provide large capacity; and (4) an MBIST solution to efficiently test large-capacity memory at high speed.
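The headline bandwidth figure follows directly from the HBM interface width and per-pin data rate quoted in the abstract (1024 IOs at 5Gb/s/pin). A quick sanity check of the arithmetic, as a minimal sketch:

```python
# Peak HBM bandwidth = interface width (bits) * per-pin rate (Gb/s) / 8 (bits per byte).
ios = 1024            # HBM interface width in bits, per the abstract
rate_gbps = 5.0       # HBM2E per-pin data rate, Gb/s/pin

bandwidth_gbs = ios * rate_gbps / 8
print(bandwidth_gbs)  # 640.0 GB/s, matching the quoted 640GB/s

# The HBM2 figure is consistent to within rounding:
# 1024 * 2.7 / 8 = 345.6 GB/s, which the abstract reports as 341GB/s.
hbm2_gbs = ios * 2.7 / 8
print(hbm2_gbs)       # 345.6
```

The small gap in the HBM2 figure (341 vs. 345.6 GB/s) suggests the paper's 2.7Gb/s/pin is itself a rounded value; the 640GB/s HBM2E figure is exact at 5Gb/s/pin.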
ISSN: 2376-8606
DOI: 10.1109/ISSCC19947.2020.9063110