
Lightweight Privacy-Preserving Cross-Cluster Federated Learning With Heterogeneous Data

Bibliographic Details
Published in: IEEE Transactions on Information Forensics and Security, 2024, Vol. 19, pp. 7404-7419
Main Authors: Chen, Zekai, Yu, Shengxing, Chen, Farong, Wang, Fuyi, Liu, Ximeng, Deng, Robert H.
Format: Article
Language:English
Description
Summary: Federated Learning (FL) eliminates data silos that hinder digital transformation while training a shared global model collaboratively. However, training a global model in FL is highly susceptible to heterogeneity and privacy concerns: discrepancies in data distribution degrade convergence, and uploading model updates can leak private data. Despite intensive research on these issues, existing approaches fail to balance robustness and privacy in FL. Furthermore, limiting model updates or iterative clustering tends to fall into local-optimum problems in heterogeneous (Non-IID) scenarios. To address these deficiencies, this work provides lightweight privacy-preserving cross-cluster federated learning (PrivCrFL) on Non-IID data, trading off robustness and privacy in Non-IID settings. PrivCrFL exploits secure one-shot hierarchical clustering with cross-cluster shifting to optimize sub-group convergence. It further introduces intra-cluster and inter-cluster learning with separate aggregation for mutual learning between groups. Extensive experimental evaluations on three benchmark datasets show that, compared with state-of-the-art studies, PrivCrFL offers a notable performance enhancement, with improvements ranging from 0.26%↑ to 1.35%↑ across different Non-IID settings. PrivCrFL also achieves a superior communication compression ratio in secure aggregation, outperforming current state-of-the-art works by 10.59%.
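The two-level scheme described in the summary — one-shot clustering of client updates, then intra-cluster aggregation followed by inter-cluster aggregation — can be sketched as follows. This is a minimal illustration based only on the abstract: the clustering criterion, weighting, cross-cluster shifting, and the paper's secure-aggregation protocol are not reproduced, and all function names here are hypothetical.

```python
import numpy as np

def one_shot_cluster(updates, threshold):
    """Greedy single-linkage clustering of client model updates by pairwise
    Euclidean distance (a plain stand-in for the paper's secure one-shot
    hierarchical clustering). Returns an integer cluster label per client."""
    n = len(updates)
    parent = list(range(n))

    def find(i):  # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(updates[i] - updates[j]) < threshold:
                parent[find(i)] = find(j)  # merge the two clusters
    return np.array([find(i) for i in range(n)])

def two_level_aggregate(updates, labels):
    """Intra-cluster mean of each group's updates, then an inter-cluster
    mean of the per-group models, yielding the global update."""
    updates = np.stack(updates)
    cluster_means = [updates[labels == c].mean(axis=0)
                     for c in np.unique(labels)]
    return np.mean(cluster_means, axis=0)

# Usage: two nearby clients form one cluster, an outlying client its own.
updates = [np.array([1.0, 1.0]), np.array([1.1, 1.0]), np.array([5.0, 5.0])]
labels = one_shot_cluster(updates, threshold=0.5)
global_update = two_level_aggregate(updates, labels)
```

Averaging per cluster before averaging across clusters keeps a large majority group from drowning out minority data distributions, which is one plausible reading of why separate intra-/inter-cluster aggregation helps under Non-IID data.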
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2024.3435476