Federated Learning with Dynamic Model Exchange
Large amounts of data are needed to train accurate, robust machine learning models, but the acquisition of these data is complicated due to strict regulations. While many business sectors often have unused data silos, researchers face the problem of not being able to obtain a large amount of real-world data. This is especially true in the healthcare sector, since transferring these data is often associated with bureaucratic overhead because of, for example, increased security requirements and privacy laws. Federated Learning should circumvent this problem and allow training to take place directly on the data owner’s side without sending the data to a central location such as a server. Currently, there exist several frameworks for this purpose, such as TensorFlow Federated, Flower, or PySyft/PyGrid. These frameworks define models for both the server and client, since the coordination of the training is performed by a server. Here, we present a practical method that contains a dynamic exchange of the model, so that the model is not statically stored in source code. During this process, the model architecture and training configuration are defined by the researchers and sent to the server, which passes the settings to the clients. In addition, the model is transformed by the data owner to incorporate Differential Privacy. To trace a comparison between central learning and the impact of Differential Privacy, performance and security evaluation experiments were conducted. It was found that Federated Learning can achieve results on par with centralised learning and that the use of Differential Privacy can improve the robustness of the model against Membership Inference Attacks in an honest-but-curious setting.
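The abstract describes the exchange protocol only in prose; the following is a minimal sketch of the idea, not the authors' implementation and not the API of TensorFlow Federated, Flower, or PySyft/PyGrid. It assumes a JSON payload carrying the model architecture and training configuration, a hypothetical `build_model_from_config` helper on the client side, and a simplified DP-SGD-style update (gradient clipping plus Gaussian noise) standing in for the Differential Privacy transformation performed by the data owner.

```python
# Minimal sketch (illustrative only): the researcher serialises the model
# architecture and training settings, the coordinating server relays the
# payload, and the client rebuilds the model locally and trains it with a
# simplified DP-SGD-style step. All names and the JSON schema are assumptions.
import json
import torch
import torch.nn as nn

# --- researcher side: model and training settings as data, not source code ---
payload = json.dumps({
    "layers": [{"in": 30, "out": 16}, {"in": 16, "out": 2}],
    "training": {"lr": 0.01, "clip_norm": 1.0, "noise_multiplier": 1.1},
})

# --- client (data owner) side: rebuild the model from the relayed config ---
def build_model_from_config(config: dict) -> nn.Module:
    modules = []
    for i, layer in enumerate(config["layers"]):
        modules.append(nn.Linear(layer["in"], layer["out"]))
        if i < len(config["layers"]) - 1:
            modules.append(nn.ReLU())
    return nn.Sequential(*modules)

def dp_sgd_step(model, batch, targets, lr, clip_norm, noise_multiplier):
    """One simplified DP-SGD update: clip the gradient norm, add Gaussian noise."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(batch), targets)
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), clip_norm)
    with torch.no_grad():
        for p in model.parameters():
            p.grad += torch.randn_like(p.grad) * noise_multiplier * clip_norm
            p -= lr * p.grad
    return loss.item()

config = json.loads(payload)                 # received from the coordinating server
model = build_model_from_config(config)
x, y = torch.randn(8, 30), torch.randint(0, 2, (8,))  # stand-in local data
loss = dp_sgd_step(model, x, y, **config["training"])
print(f"local DP training step done, loss={loss:.3f}")
```

Note that real DP-SGD clips per-example gradients and tracks a privacy budget (ε, δ); libraries such as Opacus or TensorFlow Privacy handle that accounting, whereas this sketch only clips the aggregated batch gradient for brevity.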
Published in: | Electronics (Basel), 2022-05-11, Vol. 11 (10), p. 1530
---|---
Main Authors: | Hilberger, Hannes; Hanke, Sten; Bödenler, Markus
Format: | Article
Language: | English
Publisher: | MDPI AG, Basel
ISSN/EISSN: | 2079-9292
DOI: | 10.3390/electronics11101530
Subjects: | Algorithms; Blockchain; Data integrity; Datasets; Dynamic models; Electronic health records; Federated learning; Infrastructure; Machine learning; Noise; Privacy; Security; Servers; Source code
Source: | ProQuest - Publicly Available Content Database; Coronavirus Research Database
Rights: | © 2022 by the authors. Licensee MDPI, Basel, Switzerland. Open access under the Creative Commons Attribution (CC BY) license.