
Private and Secure Distributed Deep Learning: A Survey

Bibliographic Details
Published in: ACM Computing Surveys, 2024-11
Main Authors: Allaart, Corinne, Amiri, Saba, Bal, Henri, Belloum, Adam, Gommans, Leon, van Halteren, Aart, Klous, Sander
Format: Article
Language: English
Description
Summary: Traditionally, deep learning practitioners would bring data into a central repository for model training and inference. Recent developments in distributed learning, such as federated learning and deep learning as a service (DLaaS), do not require centralized data and instead push computing to where the distributed datasets reside. These decentralized training schemes, however, introduce additional security and privacy challenges. This survey first structures the field of distributed learning into two main paradigms and then provides an overview of the recently published protective measures for each. This work highlights both secure training methods and private inference measures. Our analyses show that recent publications, while highly dependent on the problem definition, report progress in terms of security, privacy, and efficiency. Nevertheless, we also identify several current issues within the private and secure distributed deep learning (PSDDL) field that require more research. We discuss these issues and provide a general overview of how they might be resolved.
ISSN: 0360-0300; 1557-7341
DOI: 10.1145/3703452
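
To illustrate the idea of "pushing computing to where the distributed datasets reside" mentioned in the summary, the sketch below shows a minimal federated averaging (FedAvg) loop: each client takes a local training step on its own private data and only model parameters are exchanged. This is a generic illustration, not code from the surveyed work; the names (local_step, fed_avg) and the linear-regression setup are illustrative assumptions.

```python
# Minimal FedAvg sketch: clients keep raw data local; only parameters move.
import numpy as np

rng = np.random.default_rng(0)

def local_step(w, X, y, lr=0.1):
    """One gradient step of linear regression on a client's private shard."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fed_avg(clients, w, rounds=20):
    """Server loop: clients train locally; server averages their updates,
    weighted by local dataset size."""
    for _ in range(rounds):
        updates = [local_step(w.copy(), X, y) for X, y in clients]
        sizes = np.array([len(y) for _, y in clients], dtype=float)
        w = np.average(updates, axis=0, weights=sizes)
    return w

# Synthetic example: three clients, each holding a private data shard.
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 30):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

w = fed_avg(clients, w=np.zeros(2))
print("recovered weights:", w)  # close to [2.0, -1.0] without pooling raw data
```

Even in this toy form, the protective measures surveyed in the article (e.g., secure aggregation of the client updates or differentially private noise added to them) would slot into the averaging step, since the plain parameter exchange shown here is exactly what introduces the additional security and privacy challenges the abstract describes.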