CE-Fed: Communication efficient multi-party computation enabled federated learning


Bibliographic Details
Published in:Array (New York) 2022-09, Vol.15, p.100207, Article 100207
Main Authors: Kanagavelu, Renuga, Wei, Qingsong, Li, Zengxiang, Zhang, Haibin, Samsudin, Juniarto, Yang, Yechao, Goh, Rick Siow Mong, Wang, Shangguang
Format: Article
Language:English
Description
Summary:Federated learning (FL) allows a number of parties to collectively train models without revealing their private datasets. Although FL prevents the sharing of raw data, personal or confidential data can still be extracted from the shared models. Secure Multi-Party Computation (MPC) can be leveraged to aggregate the locally trained models in a privacy-preserving manner; however, it incurs high communication cost and scales poorly in a decentralized environment. We design CE-Fed, a novel communication-efficient MPC-enabled federated learning mechanism. In particular, CE-Fed is a hierarchical mechanism that forms a model aggregation committee with a small number of members and aggregates the global model only among committee members, instead of among all participants. We develop a prototype and demonstrate the effectiveness of our mechanism on different datasets. CE-Fed achieves high accuracy, communication efficiency and scalability without compromising privacy.
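The committee-based secure aggregation described in the abstract can be sketched with additive secret sharing: each participant splits its (quantized) model update into shares, one per committee member, so no single committee member sees any individual update. This is a minimal illustrative sketch, not the paper's exact protocol; the function names, the prime modulus, and the integer-quantized updates are assumptions.

```python
import random

PRIME = 2**61 - 1  # field modulus for additive secret sharing (illustrative choice)

def share(value, n):
    """Split an integer into n additive shares modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine additive shares into the original value."""
    return sum(shares) % PRIME

def committee_aggregate(updates, committee_size):
    """Aggregate participants' updates via a small committee.

    Each participant secret-shares its update across committee_size
    members; each member accumulates one share per participant and
    reveals only the accumulated share, so only the sum is learned.
    """
    committee = [0] * committee_size
    for u in updates:
        for i, s in enumerate(share(u, committee_size)):
            committee[i] = (committee[i] + s) % PRIME
    return reconstruct(committee)

updates = [5, 7, 11]  # toy quantized model updates from three participants
total = committee_aggregate(updates, committee_size=3)
print(total)  # 23, the exact sum; averaging would divide by len(updates)
```

The communication saving in CE-Fed comes from `committee_size` being much smaller than the number of participants: each participant sends shares only to the committee rather than exchanging shares pairwise with everyone.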
ISSN:2590-0056
DOI:10.1016/j.array.2022.100207