Convergence Analysis and Optimization of Over-the-Air Federated Meta-Learning
Main Authors: Wang, Yuxin; Zheng, Jingheng; Ni, Wanli; Tian, Hui
Format: Conference Proceeding
Language: English
Subjects: Atmospheric modeling; convergence analysis; Data models; Federated meta-learning; Metalearning; over-the-air computation; Performance evaluation; Servers; transceiver optimization; Upper bound; Wireless networks
Online Access: Request full text
container_start_page | 379 |
container_end_page | 384 |
creator | Wang, Yuxin; Zheng, Jingheng; Ni, Wanli; Tian, Hui |
description | By moving the model computation from the cloud to edge devices, federated learning (FL) preserves user privacy without sending raw data to the centralized server. However, the personalized data generated by different edge devices are often statistically heterogeneous, which degrades the performance of FL significantly. In addition, frequent information exchange between the server and devices imposes heavy communication overhead for the distributed model training of FL in spectral-limited wireless networks. With the aim of overcoming the statistical challenge in a communication-efficient manner, we propose an over-the-air federated meta-learning (Air-FedML) framework from the perspectives of integrating algorithm design and wireless transmission, which enables edge devices to collaboratively learn a shared model with good adaptation to heterogeneous data. To gain theoretical insights, we derive a closed-form expression of the convergence upper bound for the proposed Air-FedML framework to capture the effect of wireless communications on learning performance. Then, we formulate a non-convex optimization problem to minimize the derived upper bound by jointly optimizing the transmit and receive strategies under the constraints of power budget and aggregation error. Numerical results verify the effectiveness of the designed algorithm and show the superiority of our Air-FedML in adapting to statistically heterogeneous data compared to existing FL schemes. |
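The abstract's core idea — over-the-air computation, where devices transmit analog pre-scaled model updates that superimpose on the wireless multiple-access channel so the server receives their sum "for free" — can be sketched as follows. This is a minimal illustration of the general AirComp aggregation principle, not the paper's exact Air-FedML transceiver design; the channel model, scaling factor `eta`, and channel-inversion transmit scalars `b` are simplifying assumptions for the sketch.

```python
import numpy as np

# Minimal over-the-air (AirComp) aggregation sketch.
# Assumption: each device knows its channel gain and inverts it,
# so the superimposed signal approximates the sum of updates.
rng = np.random.default_rng(0)
K, d = 4, 8                              # number of devices, model dimension
updates = rng.normal(size=(K, d))        # local model/meta updates x_k
h = rng.rayleigh(scale=1.0, size=K)      # channel magnitudes (assumed known)
eta = 0.5                                # receive scaling factor (hypothetical)
b = np.sqrt(eta) / h                     # transmit scalars inverting the channel
noise = rng.normal(scale=0.05, size=d)   # additive receiver noise

# Signals superimpose in the air: y = sum_k h_k * b_k * x_k + n
y = (h[:, None] * b[:, None] * updates).sum(axis=0) + noise

# Post-scaling yields a noisy estimate of the ideal average update
aggregate = y / (K * np.sqrt(eta))
ideal = updates.mean(axis=0)
error = np.abs(aggregate - ideal).max()  # aggregation error, typically small
```

In the paper's setting, the transmit scalars, receive scaling, and power budget are jointly optimized to minimize exactly this kind of aggregation error term in the convergence upper bound, rather than fixed by simple channel inversion as above.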
doi_str_mv | 10.1109/ICCWorkshops57953.2023.10283512 |
format | conference_proceeding |
publisher | IEEE |
date | 2023-05-28 |
eisbn | 9798350333077 |
tpages | 6 |
fulltext | fulltext_linktorsrc |
identifier | EISSN: 2694-2941 |
ispartof | 2023 IEEE International Conference on Communications Workshops (ICC Workshops), 2023, p.379-384 |
issn | 2694-2941 |
language | eng |
recordid | cdi_ieee_primary_10283512 |
source | IEEE Xplore All Conference Series |
subjects | Atmospheric modeling; convergence analysis; Data models; Federated meta-learning; Metalearning; over-the-air computation; Performance evaluation; Servers; transceiver optimization; Upper bound; Wireless networks |
title | Convergence Analysis and Optimization of Over-the-Air Federated Meta-Learning |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-29T01%3A53%3A22IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_CHZPO&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Convergence%20Analysis%20and%20Optimization%20of%20Over-the-Air%20Federated%20Meta-Learning&rft.btitle=2023%20IEEE%20International%20Conference%20on%20Communications%20Workshops%20(ICC%20Workshops)&rft.au=Wang,%20Yuxin&rft.date=2023-05-28&rft.spage=379&rft.epage=384&rft.pages=379-384&rft.eissn=2694-2941&rft_id=info:doi/10.1109/ICCWorkshops57953.2023.10283512&rft.eisbn=9798350333077&rft_dat=%3Cieee_CHZPO%3E10283512%3C/ieee_CHZPO%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-i204t-763cf034be7ef05feed43f82371cab36d32ef0b7750be5fa49a5413fffd00ac53%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=10283512&rfr_iscdi=true |