Accelerating Large-Scale Graph Neural Network Training on Crossbar Diet

Bibliographic Details
Published in: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2022-11, Vol. 41 (11), p. 1-1
Main Authors: Ogbogu, Chukwufumnanya, Arka, Aqeeb Iqbal, Joardar, Biresh Kumar, Doppa, Janardhan Rao, Li, Hai, Chakrabarty, Krishnendu, Pande, Partha Pratim
Format: Article
Language: English
Summary: ReRAM-based manycore architectures enable acceleration of Graph Neural Network (GNN) inference and training. GNNs exhibit characteristics of both DNNs and graph analytics. Hence, GNN training/inferencing on ReRAM-based manycore architectures gives rise to both computation and on-chip communication challenges. In this work, we leverage model pruning and efficient graph storage to reduce the computation and communication bottlenecks associated with GNN training on ReRAM-based manycore accelerators. However, traditional pruning techniques are either targeted at inference only or are not crossbar-aware. To address this, we propose a GNN pruning technique called DietGNN. DietGNN is a crossbar-aware pruning technique that achieves high-accuracy training and enables energy-, area-, and storage-efficient computing on ReRAM-based manycore platforms. The DietGNN-pruned model can be trained from scratch without any noticeable accuracy loss. Our experimental results show that when mapped onto a ReRAM-based manycore architecture, DietGNN reduces the number of crossbars by over 90% and accelerates GNN training by 2.7× compared to its unpruned counterpart. In addition, DietGNN reduces energy consumption by more than 3.5× compared to the unpruned counterpart.
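A minimal sketch of the crossbar-aware pruning idea described in the summary: weights are scored and pruned at the granularity of whole crossbar-sized tiles, so every pruned tile removes one crossbar from the hardware mapping. The NumPy example below is illustrative only; the 128×128 tile size, the L1 tile score, and the crossbar_block_prune helper are assumptions for exposition, not DietGNN's actual algorithm.

import numpy as np

def crossbar_block_prune(weights, xbar_rows=128, xbar_cols=128, keep_ratio=0.1):
    """Zero out entire crossbar-sized tiles of a weight matrix, keeping
    only the fraction of tiles with the largest L1 norm (illustrative)."""
    rows, cols = weights.shape
    # Pad so the matrix tiles evenly into crossbar-sized blocks.
    pad_r = (-rows) % xbar_rows
    pad_c = (-cols) % xbar_cols
    padded = np.pad(weights, ((0, pad_r), (0, pad_c)))
    n_br = padded.shape[0] // xbar_rows
    n_bc = padded.shape[1] // xbar_cols
    # View the matrix as a grid of (xbar_rows x xbar_cols) tiles.
    tiles = padded.reshape(n_br, xbar_rows, n_bc, xbar_cols)
    # Score each tile by total weight magnitude.
    scores = np.abs(tiles).sum(axis=(1, 3))            # shape (n_br, n_bc)
    k = max(1, int(keep_ratio * scores.size))
    threshold = np.partition(scores.ravel(), -k)[-k]   # k-th largest score
    mask = scores >= threshold                         # True = keep the tile
    # Tiles with mask == False are zeroed and need no crossbar at all.
    pruned = tiles * mask[:, None, :, None]
    return pruned.reshape(padded.shape)[:rows, :cols], mask

# Example: prune a 512x512 GNN layer, keeping ~10% of its crossbar tiles.
W = np.random.randn(512, 512)
W_pruned, tile_mask = crossbar_block_prune(W)
print("crossbars kept:", int(tile_mask.sum()), "of", tile_mask.size)

Because the keep/drop decision is made per tile rather than per individual weight, every dropped tile eliminates a whole crossbar from the mapping; this tile-level structure is the kind of saving behind the over-90% crossbar reduction the summary reports.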
ISSN: 0278-0070 (print), 1937-4151 (electronic)
DOI: 10.1109/TCAD.2022.3197342