
Channel response-aware photonic neural network accelerators for high-speed inference through bandwidth-limited optics

Bibliographic Details
Published in: Optics Express, 2022-03, Vol. 30 (7), p. 10664-10671
Main Authors: Mourgias-Alexandris, G, Moralis-Pegios, M, Tsakyridis, A, Passalis, N, Kirtas, M, Tefas, A, Rutirawut, T, Gardes, F Y, Pleros, N
Format: Article
Language: English
Summary: Photonic neural network accelerators (PNNAs) have lately been brought into the spotlight as a new class of custom hardware that can leverage the maturity of photonic integration to address the low-energy and computational-power requirements of deep learning (DL) workloads. Transferring the high-speed credentials of photonic circuitry into analogue neuromorphic computing, however, necessitates a new set of DL training methods aligned with the characteristics of the analogue photonic hardware. Herein, we present a novel channel response-aware (CRA) DL architecture that addresses the challenge of implementing high compute rates on bandwidth-limited photonic devices by incorporating their frequency response into the training procedure. The proposed architecture was validated both in software and experimentally by implementing the output layer of a neural network (NN) that classifies images of the MNIST dataset on an integrated SiPho coherent linear neuron (COLN) with a 3 dB channel bandwidth of 7 GHz. A comparative analysis between the baseline and CRA models at 20, 25 and 32 GMAC/sec/axon revealed respective experimental accuracies of 98.5%, 97.3% and 92.1% for the CRA model, outperforming the baseline model by 7.9%, 12.3% and 15.6%, respectively.
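
The abstract does not disclose implementation details, but the core idea it describes, folding the device's limited frequency response into the training loop, can be sketched as follows. This is a minimal, hypothetical PyTorch sketch, not the authors' implementation: the bandwidth-limited photonic channel is modelled as a fixed causal FIR filter applied to each output neuron's symbol stream inside the forward pass, so the learned weights implicitly compensate for the resulting inter-symbol interference. All names (ChannelResponseLayer, CRAOutputLayer, taps) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelResponseLayer(nn.Module):
    """Applies a fixed low-pass FIR model of the photonic channel to a symbol stream."""

    def __init__(self, taps: torch.Tensor):
        super().__init__()
        # taps: assumed impulse response of the bandwidth-limited channel (non-trainable)
        self.register_buffer("taps", taps.view(1, 1, -1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time) stream of analogue neuron outputs at the target symbol rate
        x = x.unsqueeze(1)                                # (batch, 1, time)
        pad = self.taps.shape[-1] - 1
        y = F.conv1d(F.pad(x, (pad, 0)), self.taps)       # causal filtering
        return y.squeeze(1)                               # (batch, time)


class CRAOutputLayer(nn.Module):
    """Linear output layer whose pre-activations pass through the channel model."""

    def __init__(self, in_features: int, out_features: int, taps: torch.Tensor):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.channel = ChannelResponseLayer(taps)

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: (batch, time, in_features), inputs streamed one symbol per time step
        z = self.linear(x_seq)                            # (batch, time, out_features)
        # Filter each output neuron's time series with the channel response
        z = torch.stack(
            [self.channel(z[..., k]) for k in range(z.shape[-1])], dim=-1
        )
        return torch.sigmoid(z)                           # stand-in analogue nonlinearity
```

In such a setup the classification loss would be computed on the channel-filtered outputs, so that training at symbol rates well above the device's 3 dB bandwidth (as in the 20-32 GMAC/sec/axon experiments reported above) yields weights that pre-compensate the channel rather than assuming an ideal, flat response.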
ISSN: 1094-4087
DOI: 10.1364/OE.452803