
Fed2PKD: Bridging Model Diversity in Federated Learning via Two-Pronged Knowledge Distillation


Bibliographic Details
Main Authors: Xie, Zaipeng; Xu, Han; Gao, Xing; Jiang, Junchen; Han, Ruiqian
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
description Heterogeneous federated learning (HFL) enables collaborative learning across clients with diverse model architectures and data distributions while preserving privacy. However, existing HFL approaches often struggle to effectively address the challenges posed by model diversity, leading to suboptimal performance and limited generalization ability. This paper proposes Fed2PKD, a novel HFL framework that tackles these challenges through a two-pronged knowledge distillation approach. Fed2PKD combines prototypical contrastive knowledge distillation to align client embeddings with global class prototypes and semi-supervised global knowledge distillation to capture global data characteristics. Experimental results on three benchmarks (MNIST, CIFAR10, and CIFAR100) demonstrate that Fed2PKD significantly outperforms existing state-of-the-art HFL methods, achieving average improvements of up to 30.53%, 13.89%, and 5.80% in global model accuracy, respectively. Furthermore, Fed2PKD enables personalized models for each client, adapting to their specific data distributions and model architectures while benefiting from global knowledge sharing. Theoretical analysis provides convergence guarantees for Fed2PKD under realistic assumptions. Fed2PKD represents a significant step forward in HFL, unlocking the potential for privacy-preserving collaborative learning in real-world scenarios with model and data diversity.
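The first prong described in the abstract, prototypical contrastive knowledge distillation, aligns each client's embeddings with global class prototypes. The sketch below shows one way such an alignment loss could look; the function name, the InfoNCE-style formulation, and the temperature value are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch only: aligns client embeddings with global class prototypes
# via a contrastive (InfoNCE-style) objective. Names and hyperparameters are
# assumptions for demonstration, not the paper's actual implementation.
import torch
import torch.nn.functional as F

def prototypical_contrastive_loss(embeddings, labels, prototypes, temperature=0.5):
    """Pull each embedding toward its class's global prototype and away from the
    prototypes of the other classes."""
    emb = F.normalize(embeddings, dim=1)       # (batch, dim)
    protos = F.normalize(prototypes, dim=1)    # (num_classes, dim)
    logits = emb @ protos.t() / temperature    # cosine similarity to every prototype
    return F.cross_entropy(logits, labels)     # target prototype = ground-truth class

# Toy usage: 8 samples, 4 classes, 16-dim embeddings.
emb = torch.randn(8, 16)
labels = torch.randint(0, 4, (8,))
global_prototypes = torch.randn(4, 16)
loss = prototypical_contrastive_loss(emb, labels, global_prototypes)
```

The abstract's second prong, semi-supervised global knowledge distillation to capture global data characteristics, would add a separate distillation term on top of this; it is not sketched here.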
doi_str_mv 10.1109/CLOUD62652.2024.00011
format conference_proceeding
eisbn 9798350368536
coden IEEPAD
publisher IEEE
date 2024-07-07
identifier EISSN: 2159-6190
ispartof 2024 IEEE 17th International Conference on Cloud Computing (CLOUD), 2024, p.1-11
issn 2159-6190
language eng
recordid cdi_ieee_primary_10643935
source IEEE Xplore All Conference Series
subjects Adaptation models
Cloud computing
Computational modeling
data heterogeneity
Data privacy
Federated learning
Heterogeneous federated learning
Knowledge engineering
model diversity
Prototypes
two-pronged knowledge distillation
title Fed2PKD: Bridging Model Diversity in Federated Learning via Two-Pronged Knowledge Distillation