Data-Quality-Driven Federated Learning for Optimizing Communication Costs
Main Authors:
Format: Conference Proceeding
Language: English
Summary: Federated Learning (FL) is a distributed machine learning approach that allows mobile devices to train a global model cooperatively without uploading privacy-sensitive data to the cloud. To improve the accuracy of the model, it must be updated frequently. However, an FL system at the mobile edge must adapt to limited communication bandwidth. At the same time, the statistical heterogeneity inherent in FL means that the number of participating clients cannot be reduced blindly. We found that the accuracy of the global model depends heavily on the clients whose data is more similar and balanced. In this paper, we first define the "data quality" of a client to appraise the impact of that client's data on the accuracy of the global model. Then, based on data quality, we design a client-selection scheme that optimizes the communication costs of FL by screening out the clients that determine the accuracy of the global model. To the authors' best knowledge, this is the first paper to reduce the costs of FL by assessing the data quality of clients. Experimental results show that on imbalanced SVHN, the communication cost of our algorithm is reduced by 56% compared with vanilla FL. Compared with vanilla FL, which requires all clients to participate in training, our algorithm changes average accuracy by -1.58%, +3.19%, and -0.01% on the imbalanced CIFAR10, imbalanced FMNIST, and imbalanced SVHN datasets, respectively. In other words, our algorithm reduces communication overhead with negligible degradation of accuracy.
ISSN: 2690-5965
DOI: 10.1109/ICPADS60453.2023.00210
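The abstract does not specify how "data quality" is computed or how clients are screened; the full text would be needed for that. As a rough illustration only, the Python sketch below assumes a hypothetical data-quality score (the normalized entropy of each client's label histogram, rewarding balanced data) and a hypothetical top-k client selection. This is one plausible reading of the idea, not the authors' actual method.

```python
import numpy as np

def data_quality(label_counts):
    """Hypothetical score in [0, 1]: normalized entropy of a client's
    label histogram (1.0 = perfectly balanced, ~0.0 = one class only).
    Assumed stand-in; the paper's actual metric is not given in the abstract."""
    counts = np.asarray(label_counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(counts)))

def select_clients(client_label_counts, fraction=0.5):
    """Keep the top `fraction` of clients by the assumed quality score;
    only selected clients would train and upload updates each round."""
    scores = {cid: data_quality(c) for cid, c in client_label_counts.items()}
    k = max(1, int(round(len(scores) * fraction)))
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy example: three clients holding 3-class data with different balance.
clients = {
    "c0": [100, 95, 105],  # nearly balanced -> high score
    "c1": [290, 5, 5],     # heavily imbalanced -> low score
    "c2": [80, 110, 110],  # moderately balanced -> high score
}
print(select_clients(clients, fraction=2 / 3))  # -> ['c0', 'c2']
```

Under this assumed score, the more balanced clients are the ones selected, which matches the abstract's observation that global-model accuracy depends heavily on clients with similar, balanced data; skipping the low-scoring clients is what saves communication rounds.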