FedME2: Memory Evaluation & Erase Promoting Federated Unlearning in DTMN


Bibliographic Details
Published in: IEEE Journal on Selected Areas in Communications, 2023-11, Vol. 41 (11), p. 1-1
Main Authors: Xia, Hui, Xu, Shuo, Pei, Jiaming, Zhang, Rui, Yu, Zhi, Zou, Weitao, Wang, Lukun, Liu, Chao
Format: Article
Language:English
Description
Summary: Digital Twins (DTs) can generate digital replicas of mobile networks (MNs) that accurately reflect the state of the MN. Machine learning (ML) models trained in DT-for-MN (DTMN) virtual environments can be deployed in the MN more robustly, avoiding the training difficulties and runtime errors caused by MN instability and multiple failures. However, when using data from the various devices in the MN system, DTs must prioritize data privacy. Federated learning (FL) enables models to be built without data leaving devices, protecting DTMN data privacy. Nevertheless, FL's privacy protection needs further improvement, because it only guarantees device-level data ownership and ignores that models may retain private information from the data. This paper therefore focuses on data forgetting for privacy protection and proposes a novel FL-based unlearning framework (FedME2) comprising MEval and MErase modules. Guided by memory-evaluation information from MEval and using MErase's multi-loss training approach, FedME2 achieves accurate data forgetting in DTMN. In four DTMN virtual environments, FedME2 achieves an average data forgetting rate of approximately 75% for global models under FL while keeping the impact on global model accuracy below 4%. FedME2 thus provides stronger data forgetting and improves DTMN data privacy protection while preserving model accuracy.
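
As a rough illustration of the multi-loss unlearning idea mentioned in the summary, the sketch below combines a retention loss on data the model should keep with a forgetting loss that pushes predictions on to-be-forgotten data toward a uniform distribution. This is a hypothetical PyTorch sketch, not the paper's actual MEval/MErase formulation; the function name, the KL-to-uniform forgetting loss, and the alpha weight are assumptions made for illustration only.

    # Hypothetical sketch of a multi-loss unlearning step (assumed, not the
    # paper's MErase method): keep accuracy on retained data while erasing
    # class-specific memory of the forget set.
    import torch
    import torch.nn.functional as F

    def unlearning_step(model, optimizer, retain_batch, forget_batch, alpha=1.0):
        """One local update combining a retention loss and a forgetting loss.

        retain_batch / forget_batch: (inputs, labels) tuples.
        alpha: assumed weight balancing forgetting against retention.
        """
        model.train()
        optimizer.zero_grad()

        # Retention loss: ordinary cross-entropy on data the model should keep.
        x_r, y_r = retain_batch
        retain_loss = F.cross_entropy(model(x_r), y_r)

        # Forgetting loss: drive predictions on the forget set toward a uniform
        # distribution so no class-specific information remains.
        x_f, _ = forget_batch
        logits_f = model(x_f)
        uniform = torch.full_like(logits_f, 1.0 / logits_f.size(1))
        forget_loss = F.kl_div(F.log_softmax(logits_f, dim=1), uniform,
                               reduction="batchmean")

        loss = retain_loss + alpha * forget_loss
        loss.backward()
        optimizer.step()
        return retain_loss.item(), forget_loss.item()

In a federated setting, a step like this would run on each client before the updated weights are sent back for aggregation; the actual guidance signals used by FedME2's MEval module are described in the full paper.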
ISSN: 0733-8716
eISSN: 1558-0008
DOI: 10.1109/JSAC.2023.3310049