On the Convergence of Decentralized Federated Learning Under Imperfect Information Sharing

Bibliographic Details
Published in: IEEE Control Systems Letters, 2023-01, Vol. 7, p. 1-1
Main Authors: Chellapandi, Vishnu Pandi, Upadhyay, Antesh, Hashemi, Abolfazl, Zak, Stanislaw H
Format: Article
Language: English
Description
Summary: Most of the current literature on decentralized learning centers on the celebrated average-consensus paradigm, with less attention devoted to scenarios where the communication between agents may be imperfect. This paper presents three algorithms for Decentralized Federated Learning (DFL) in the presence of imperfect information sharing, modeled as noisy communication channels. The first algorithm, Federated Noisy Decentralized Learning (FedNDL1), comes from the literature: noise is added to the algorithm parameters to simulate noisy communication channels, and clients share parameters over a communication graph topology to form a consensus. The proposed second algorithm (FedNDL2) is similar to the first, but it performs the gossip averaging of the noisy parameters before the gradient optimization step. The proposed third algorithm (FedNDL3), on the other hand, shares gradients, rather than parameters, through the noisy communication channels. Theoretical and experimental results show that, under imperfect information sharing, the third scheme, which mixes gradients, is more robust to channel noise than the algorithms from the literature that mix parameters.
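The three variants differ only in where the channel noise enters the gossip step: FedNDL1 gossips noisy parameters after the local gradient step, FedNDL2 gossips noisy parameters before it, and FedNDL3 gossips noisy gradients instead. The toy simulation below sketches this distinction; the function name `run_fedndl`, the quadratic local objectives, and the fully connected mixing matrix are illustrative assumptions, not the paper's exact update rules or experimental setup.

```python
import numpy as np

def run_fedndl(variant, n_agents=4, dim=3, sigma=0.01, step=0.1,
               rounds=200, seed=0):
    """Toy DFL simulation under additive channel noise.

    Each agent i minimizes the quadratic f_i(w) = 0.5 * ||w - c_i||^2,
    whose gradient is (w - c_i); the global optimum is the mean of the c_i.
    """
    rng = np.random.default_rng(seed)
    C = rng.normal(size=(n_agents, dim))                # local optima c_i (toy data)
    W = np.full((n_agents, n_agents), 1.0 / n_agents)   # doubly stochastic mixing matrix
    X = np.zeros((n_agents, dim))                       # row i = agent i's parameters
    for _ in range(rounds):
        channel = sigma * rng.normal(size=(n_agents, dim))  # additive channel noise
        if variant == "FedNDL1":
            # local gradient step, then gossip the *noisy parameters*
            X = X - step * (X - C)
            X = W @ (X + channel)
        elif variant == "FedNDL2":
            # gossip the noisy parameters first, then take the gradient step
            X = W @ (X + channel)
            X = X - step * (X - C)
        elif variant == "FedNDL3":
            # gossip *noisy gradients* instead of parameters, then step
            G = W @ ((X - C) + channel)
            X = W @ X - step * G
        else:
            raise ValueError(f"unknown variant: {variant}")
    return X, C
```

In this sketch the noise enters FedNDL3 only through the gradient average, where it is scaled down by the step size before touching the parameters, which is one intuition for why gradient mixing tolerates channel noise better than parameter mixing.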
ISSN: 2475-1456
DOI: 10.1109/LCSYS.2023.3290470