Secure Aggregation is Insecure: Category Inference Attack on Federated Learning
Published in: IEEE Transactions on Dependable and Secure Computing, 2023-01, Vol. 20 (1), pp. 147-160
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Federated learning allows a large number of resource-constrained clients to train a globally-shared model together without sharing local data. These clients usually have only a few classes (categories) of data for training, and the data distribution is non-iid (not independent and identically distributed). In this article, we put forward the concept of category privacy for the first time to indicate which classes of data a client has, which is an important but overlooked privacy goal in federated learning with non-iid data. Although secure aggregation protocols are designed for federated learning to protect the input privacy of clients, we perform the first systematic study of the category inference attack and demonstrate that these protocols cannot fully protect category privacy. We design a differential selection strategy and two de-noising approaches to achieve the attack goal successfully. In our evaluation, we apply the attack to non-iid federated learning settings with various datasets. On the MNIST, CIFAR-10, AG_news, and DBPedia datasets, our attack achieves >90% accuracy measured by F1-score in most cases. We further consider a possible detection method and propose two strategies to make the attack more inconspicuous.
ISSN: 1545-5971, 1941-0018
DOI: 10.1109/TDSC.2021.3128679
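The summary above treats a client's category set (the labels it holds under a non-iid partition) as the private information, and scores inference accuracy with an F1-score. The sketch below is purely illustrative, not the paper's attack: it shows, under assumed hypothetical names and parameters, what a per-client category set looks like and how an inferred set could be scored against the ground truth.

```python
# Illustrative sketch only (not the paper's method): what "category privacy"
# means in non-iid federated learning, and how an inferred category set could
# be scored with an F1-score. All names and parameters here are hypothetical.
import random

NUM_CLASSES = 10          # e.g., a 10-class label space such as MNIST or CIFAR-10
NUM_CLIENTS = 5
CLASSES_PER_CLIENT = 2    # non-iid: each client holds only a few categories

random.seed(0)

# Each client's private category set -- the information a category inference
# attack would try to recover despite secure aggregation.
true_categories = {
    c: set(random.sample(range(NUM_CLASSES), CLASSES_PER_CLIENT))
    for c in range(NUM_CLIENTS)
}

def f1(inferred: set, truth: set) -> float:
    """F1-score of an inferred category set against the ground-truth set."""
    if not inferred:
        return 0.0
    tp = len(inferred & truth)
    if tp == 0:
        return 0.0
    precision = tp / len(inferred)
    recall = tp / len(truth)
    return 2 * precision * recall / (precision + recall)

# A hypothetical guess for client 0: one correct label and one label the
# client does not actually hold.
correct_label = next(iter(true_categories[0]))
wrong_label = next(l for l in range(NUM_CLASSES) if l not in true_categories[0])
inferred = {correct_label, wrong_label}

print(f"client 0 true categories: {sorted(true_categories[0])}")
print(f"attacker's guess:         {sorted(inferred)}")
print(f"F1-score of the guess:    {f1(inferred, true_categories[0]):.2f}")
```

A perfect inference of the full category set yields an F1-score of 1.0, which is the sense in which the reported >90% figures can be read.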