Client-Free Federated Unlearning via Training Reconstruction with Anchor Subspace Calibration
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: A federated learning (FL) model often needs to forget what it has learned from a particular client for various reasons, which gives rise to the federated unlearning (FU) technique. Due to the distributed nature of FL, removing a specific client's contribution from the global model potentially requires the cooperation of all participants, making FU difficult to apply in real-world scenarios. This paper proposes a simple yet effective client-free FU algorithm that runs solely on the central server. The algorithm uses the clients' cached historical updates from the initial training to reconstruct the training process with the target client excluded. To circumvent the issue of adaptivity, the key challenge in training reconstruction, we leverage the low-dimensional structure of the gradient space in deep networks. Specifically, we propose to project the historical gradients onto a low-dimensional subspace, given by the top gradient eigenspace on a small public dataset. According to experiments on three canonical datasets, our method achieves efficient unlearning while preserving high model utility.
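The summary describes a two-step server-side procedure: estimate a low-dimensional subspace from gradients computed on a small public dataset, then replay the cached training history with the target client's updates excluded, projecting each retained update onto that subspace. The sketch below is a minimal NumPy reading of that description, not the paper's implementation; every name and hyperparameter in it (num_rounds, top_k, lr, the FedAvg-style aggregation, the random placeholder data) is an assumption made for illustration.

```python
# Hypothetical sketch of client-free federated unlearning via training
# reconstruction with subspace projection, based only on the abstract.
import numpy as np

rng = np.random.default_rng(0)
dim, num_clients, num_rounds, top_k, lr = 50, 5, 10, 8, 1.0
target = 2  # client whose contribution should be forgotten

# Cached history from the initial training: per-round, per-client updates
# (random placeholders standing in for real stored updates).
cached_updates = rng.normal(size=(num_rounds, num_clients, dim))

# Step 1: estimate the anchor subspace from gradients on a small public
# dataset (again placeholders); take the top right-singular vectors, i.e.
# the top eigenvectors of the uncentered gradient covariance.
public_grads = rng.normal(size=(100, dim))
_, _, vt = np.linalg.svd(public_grads, full_matrices=False)
basis = vt[:top_k]  # (top_k, dim), orthonormal rows

def project(update):
    """Project an update onto the top gradient eigenspace."""
    return basis.T @ (basis @ update)

# Step 2: replay training on the server, excluding the target client and
# projecting each retained historical update onto the anchor subspace.
model = np.zeros(dim)
for t in range(num_rounds):
    retained = [project(cached_updates[t, c])
                for c in range(num_clients) if c != target]
    model += lr * np.mean(retained, axis=0)  # FedAvg-style aggregation

print("unlearned model norm:", np.linalg.norm(model))
```

In this reading, the projection onto a fixed public-data eigenspace is what makes the replay client-free: the server never re-contacts clients, it only filters the updates it already stored.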
ISSN: 2379-190X
DOI: 10.1109/ICASSP48485.2024.10447085