
Communication-efficient Federated Learning for Power Load Forecasting in Electric IoTs

Bibliographic Details
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main Authors: Mao, Zhengxiong; Li, Hui; Huang, Zuyuan; Yang, Chuanxu; Li, Yanan; Zhou, Zihao
Format: Article
Language: English
Abstract: As the modern power system is built out, power load forecasting is essential to keeping the electric Internet of Things in operation. Conventional forecasting, however, usually requires collecting massive amounts of power load data on a central server, which risks leaking the raw data and compromising privacy. Federated learning protects clients' raw power load data by transmitting only model updates, but frequent communication with the server places a growing burden on resource-heterogeneous clients. To reduce this communication cost, a communication-efficient federated learning algorithm based on Compressed Model Updates and Lazy uploAd (CMULA-FL) is proposed; it also integrates an error-compensation strategy to preserve model utility. First, a compression operator compresses the transmitted model updates, and only updates with large norms are uploaded, reducing both the per-epoch communication cost and the transmission frequency. Second, the error introduced by compression and lazy upload is measured and accumulated into the next epoch, improving model utility. Finally, simulation experiments on benchmark power load data show that the communication cost decreases by at least 60% relative to baselines, with controlled loss in model prediction accuracy.
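The client-side mechanism the abstract describes — compress each model update, upload it only when its norm is large enough, and carry the unsent residual into the next epoch as error compensation — can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the top-k compression operator, the `tau` threshold, and the `Client` class are assumptions chosen for clarity, and the paper's actual operator and scheduling rule may differ.

```python
import numpy as np

def topk_compress(update, k):
    """Keep only the k largest-magnitude entries of a flat update vector."""
    idx = np.argsort(np.abs(update))[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse

class Client:
    """One federated client with compression, lazy upload, and error feedback."""

    def __init__(self, dim, k, tau):
        self.residual = np.zeros(dim)  # error accumulated from previous epochs
        self.k = k                     # sparsity level of the compressor
        self.tau = tau                 # lazy-upload norm threshold

    def round(self, local_update):
        # 1. Error compensation: fold in what was not sent in earlier epochs.
        corrected = local_update + self.residual
        # 2. Compress the corrected update.
        compressed = topk_compress(corrected, self.k)
        # 3. Lazy upload: transmit only if the compressed update is large enough.
        if np.linalg.norm(compressed) >= self.tau:
            sent = compressed
        else:
            sent = np.zeros_like(compressed)
        # 4. Whatever was not transmitted becomes next epoch's residual.
        self.residual = corrected - sent
        return sent
```

Under this sketch, a skipped or heavily compressed update is never discarded outright: its content is replayed in later rounds via `self.residual`, which is the error-feedback idea the abstract credits for limiting the loss of model utility.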
DOI: 10.1109/ACCESS.2023.3262171
ISSN: 2169-3536
Subjects: Algorithms; Clients; Communication; Computational modeling; Costs; Data models; Error analysis; Error compensation; Federated learning; Forecasting; Internet of Things; lazy upload; Load forecasting; Load modeling; Machine learning; Mathematical models; Norms; Power load forecasting; Predictive models; Privacy; quantization; Servers