Yin-Yang: Programming Abstractions for Cross-Domain Multi-Acceleration

Bibliographic Details
Published in: IEEE Micro, 2022-09, Vol. 42 (5), p. 89-98
Main Authors: Kim, Joon Kyung, Ahn, Byung Hoon, Kinzer, Sean, Ghodrati, Soroush, Mahapatra, Rohan, Yatham, Brahmendra, Wang, Shu-Ting, Kim, Dohee, Sarikhani, Parisa, Mahmoudi, Babak, Mahajan, Divya, Park, Jongse, Esmaeilzadeh, Hadi
Format: Article
Language: English
Summary: Field-programmable gate array (FPGA) accelerators offer performance and efficiency gains by narrowing the scope of acceleration to one algorithmic domain. However, real-life applications are often not limited to a single domain, which naturally makes Cross-Domain Multi-Acceleration a crucial next step. The challenge is that existing FPGA accelerators are built upon their own vertically specialized stacks, which prevents utilizing multiple accelerators from different domains. To that end, we propose a pair of dual abstractions, called Yin-Yang, which work in tandem and enable programmers to develop cross-domain applications using multiple accelerators on an FPGA. The Yin abstraction enables cross-domain algorithmic specification, while the Yang abstraction captures the accelerator capabilities. We also developed a dataflow virtual machine, dubbed Accelerator-Level Virtual Machine (XLVM), which transparently maps domain functions (Yin) to best-fit accelerator capabilities (Yang). With six real-world cross-domain applications, our evaluations show that Yin-Yang unlocks 29.4× speedup, while the best single-domain acceleration achieves 12.0×.
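The summary describes the division of labor only at a high level. The minimal Python sketch below illustrates the general idea of matching accelerator-agnostic domain functions (the Yin side) against declared accelerator capabilities (the Yang side), the role the paper assigns to XLVM; all class, function, and accelerator names here are hypothetical and do not reflect the actual Yin-Yang or XLVM API.

from dataclasses import dataclass

@dataclass
class Capability:
    """Yang-style record of what one FPGA accelerator can execute."""
    accelerator: str
    domains: set   # algorithmic domains this accelerator covers
    ops: set       # domain functions it implements

@dataclass
class DomainFunction:
    """Yin-style, accelerator-agnostic specification of one pipeline stage."""
    name: str
    domain: str

class VirtualMachine:
    """Toy stand-in for an accelerator-level VM that maps each domain
    function (Yin) to a best-fit accelerator capability (Yang)."""
    def __init__(self, capabilities):
        self.capabilities = capabilities

    def map_pipeline(self, pipeline):
        plan = []
        for fn in pipeline:
            # Pick the first accelerator whose capability covers this
            # function's domain and operation; a real mapper would rank fit.
            match = next((c for c in self.capabilities
                          if fn.domain in c.domains and fn.name in c.ops), None)
            if match is None:
                raise RuntimeError(f"no accelerator offers {fn.name} ({fn.domain})")
            plan.append((fn.name, match.accelerator))
        return plan

# A cross-domain pipeline: signal processing followed by ML inference.
pipeline = [DomainFunction("fft", "dsp"), DomainFunction("conv2d", "ml")]
caps = [Capability("dsp_accel", {"dsp"}, {"fft", "fir"}),
        Capability("ml_accel", {"ml"}, {"conv2d", "matmul"})]
print(VirtualMachine(caps).map_pipeline(pipeline))
# -> [('fft', 'dsp_accel'), ('conv2d', 'ml_accel')]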
ISSN: 0272-1732, 1937-4143
DOI: 10.1109/MM.2022.3189416