Decentralized Machine Learning over the Internet
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: Decentralized machine learning has attracted extensive research interest in recent years. Compared to its centralized counterpart, which relies on a server to collect and disseminate messages, decentralized machine learning relies on message passing among neighboring nodes and thus avoids the communication bottleneck of the server. However, most existing decentralized machine learning algorithms operate within data centers equipped with multiple CPUs and/or GPUs, assuming that the underlying communication networks are error-free. In this paper, we implement decentralized parallel stochastic gradient descent (D-PSGD) over multiple computers connected via the Internet, which is relatively unreliable. To cope with the inevitable packet losses caused by the Internet environment, we modify D-PSGD by allowing it to use stale messages. Numerical experiments confirm the performance gain of decentralized machine learning over its centralized counterpart in terms of runtime, as well as the robustness of the modified D-PSGD to packet losses.
ISSN: 2161-2927
DOI: 10.23919/CCC55666.2022.9901831
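The summary above describes modifying D-PSGD so that, when a neighbor's message is lost, a worker mixes in the most recently received (stale) copy of that neighbor's model instead of stalling. The paper's exact protocol, topology, and datasets are not given in this record; the sketch below is only a minimal toy simulation of that idea, assuming a NumPy implementation, a ring topology with a fixed mixing matrix `W`, a synthetic least-squares objective per worker, and a per-message drop probability `loss_rate` — all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical toy setup: each worker holds a shard of a least-squares problem ---
n_workers, dim, steps, lr = 4, 10, 200, 0.05
loss_rate = 0.2  # assumed probability that a neighbor's message is dropped

A = [rng.normal(size=(50, dim)) for _ in range(n_workers)]
x_true = rng.normal(size=dim)
b = [A_i @ x_true + 0.1 * rng.normal(size=50) for A_i in A]

# Ring topology with a symmetric, doubly stochastic mixing matrix W.
W = np.zeros((n_workers, n_workers))
for i in range(n_workers):
    W[i, i] = 0.5
    W[i, (i - 1) % n_workers] = 0.25
    W[i, (i + 1) % n_workers] = 0.25

# Local models, plus a cache of the last model received from each neighbor.
x = [np.zeros(dim) for _ in range(n_workers)]
cache = [[x[j].copy() for j in range(n_workers)] for _ in range(n_workers)]

def local_grad(i, xi):
    """Mini-batch stochastic gradient of worker i's local least-squares loss."""
    idx = rng.choice(len(b[i]), size=10, replace=False)
    Ai, bi = A[i][idx], b[i][idx]
    return Ai.T @ (Ai @ xi - bi) / len(idx)

for t in range(steps):
    new_x = []
    for i in range(n_workers):
        # Gossip-averaging step: use a fresh neighbor model if the message
        # arrives; otherwise fall back to the cached (stale) copy.
        mixed = np.zeros(dim)
        for j in range(n_workers):
            if W[i, j] == 0:
                continue
            if j == i or rng.random() > loss_rate:  # message delivered
                cache[i][j] = x[j].copy()
            mixed += W[i, j] * cache[i][j]          # stale copy if dropped
        new_x.append(mixed - lr * local_grad(i, x[i]))
    x = new_x

print("mean distance to x_true:",
      np.mean([np.linalg.norm(xi - x_true) for xi in x]))
```

In this sketch, a dropped message simply reuses the neighbor's last cached model in the averaging step, so each iteration still completes; this mirrors the stale-message idea in the summary, but the topology, objective, and loss model here are assumptions chosen only to keep the example self-contained.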