A Robust and Privacy-Aware Federated Learning Framework for Non-Intrusive Load Monitoring
Published in: IEEE Transactions on Sustainable Computing, 2024-09, Vol. 9 (5), pp. 766-777
Main Authors: , ,
Format: Article
Language: English
Summary: With the rollout of smart meters, a vast amount of energy time-series data has become available from homes, enabling applications such as non-intrusive load monitoring (NILM). The inconspicuous collection of these data, however, poses a risk to customer privacy. Federated Learning (FL) eliminates the need to share raw data with a cloud service provider by allowing machine learning models to be trained collaboratively on decentralized data. Although several NILM techniques that rely on FL to train a deep neural network for identifying the energy consumption of individual appliances have been proposed in recent years, their robustness to malicious users and their ability to fully protect user privacy remain unexplored. In this paper, we present a robust and privacy-preserving FL-based framework to train a bidirectional transformer architecture for NILM. The framework takes advantage of a meta-learning algorithm to handle the data heterogeneity prevalent in real-world settings. Its efficacy is corroborated through comparative experiments on two real-world NILM datasets. The results show that the framework attains accuracy on par with a centrally trained energy disaggregation model while preserving user privacy.
ISSN: 2377-3782, 2377-3790
DOI: 10.1109/TSUSC.2024.3370837
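
The record itself contains no code, but the summary's description of collaborative training on decentralized meter data follows the general federated-averaging pattern. The sketch below is a minimal, self-contained illustration of that pattern only: it substitutes a toy linear model for the paper's bidirectional transformer, omits the meta-learning and robustness mechanisms, and every name, dataset size, and hyperparameter is an illustrative assumption rather than the authors' implementation.

```python
# Minimal federated-averaging sketch, NOT the paper's method: the linear
# "model", client sizes, and hyperparameters below are illustrative only.
import numpy as np

def local_update(weights, X, y, lr=0.05, epochs=5):
    """One client's local step: gradient descent on its private load data
    (stands in for training the per-appliance NILM model locally)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server-side aggregation: weight each client's parameters by its
    sample count; only parameters are exchanged, never raw readings."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])

# Three simulated households with differently sized (heterogeneous) datasets.
clients = []
for n in (40, 80, 120):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    clients.append((X, y))

global_w = np.zeros(3)
for _ in range(20):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print("recovered weights:", np.round(global_w, 3))
```

Only the parameter vectors returned by `local_update` reach the aggregator; the simulated per-household readings never leave the client, which is the privacy property the summary highlights.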