
Remote Identification of Neural Network FPGA Accelerators by Power Fingerprints


Bibliographic Details
Main Authors: Meyers, Vincent, Hefenbrock, Michael, Gnad, Dennis, Tahoori, Mehdi
Format: Conference Proceeding
Language: English
Description
Summary: Machine learning acceleration has become increasingly popular in recent years, with machine learning-as-a-service (MLaaS) scenarios offering convenient and efficient access to pre-trained neural network models on devices such as cloud FPGAs. However, this ease of access and use also raises concerns over model theft or misuse through model manipulation. To address these concerns, this paper proposes a method for identifying neural network models in MLaaS scenarios by their unique power consumption. Current fingerprinting methods for neural networks rely on input/output pairs or on characteristics of the decision boundary, which might not always be accessible in more complex systems. Our proposed method exploits the unique power characteristics of the black-box neural network accelerator, extracting a fingerprint by measuring the voltage fluctuations of the device while it processes specially crafted inputs. We take advantage of the fact that the accelerator's power consumption varies with the input being processed. To evaluate our method, we conduct 200 fingerprint extraction and matching experiments; the results confirm that the proposed method distinguishes between correct and incorrect models in 100% of the cases. Furthermore, we show that the fingerprint is robust to environmental and chip-to-chip variations.
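The sketch below is only a rough illustration of the fingerprint-extraction-and-matching idea described in the abstract, not the authors' implementation. It assumes voltage traces for the crafted queries have already been captured (e.g., by on-chip sensors); the trace averaging, Pearson-correlation matching, and the 0.9 decision threshold are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def extract_fingerprint(traces: np.ndarray) -> np.ndarray:
    """Average repeated voltage traces (one row per query of a crafted
    input) to suppress measurement noise; the mean trace serves as the
    accelerator's power fingerprint in this sketch."""
    return traces.mean(axis=0)

def match_fingerprint(candidate: np.ndarray, reference: np.ndarray,
                      threshold: float = 0.9) -> bool:
    """Declare a match if the Pearson correlation between candidate and
    reference fingerprints exceeds a (hypothetical) threshold."""
    corr = np.corrcoef(candidate, reference)[0, 1]
    return corr >= threshold

# Synthetic traces standing in for captured voltage-sensor readings.
rng = np.random.default_rng(0)
reference_trace = np.sin(np.linspace(0, 20, 500))            # stand-in reference fingerprint
noisy_queries = reference_trace + 0.1 * rng.standard_normal((50, 500))
candidate = extract_fingerprint(noisy_queries)
print(match_fingerprint(candidate, reference_trace))          # True when the model matches
```

In this toy setup, a fingerprint taken from the correct model correlates strongly with the stored reference, while a different model's power behavior would yield a low correlation and be rejected; the paper reports such a separation in all 200 of its experiments.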
ISSN:1946-1488
DOI:10.1109/FPL60245.2023.00044