pGBF: Personalized Gradient Boosting Forest

Bibliographic Details
Main Authors: Enkhtaivan, Batnyam, Teranishi, Isamu
Format: Conference Proceeding
Language: English
Description
Summary: Due to regulations protecting user data privacy and concerns about trade secrets, industrial organizations do not share user data with others. Federated learning makes it possible for multiple organizations to train a global model without revealing their data. Since, in practice, data distributions differ across organizations, it is necessary to personalize the model so that it performs better on the data of a single participant. In this paper, we present the first personalized federated learning method for Gradient Boosting Decision Trees (GBDT) focused on classification tasks, namely Personalized Gradient Boosting Forest (pGBF). Our method extends the existing federated learning method Gradient Boosting Forest (GBF). Our experiments on three public datasets show that pGBF performs better than or comparably to the existing methods, GBDT and GBF, in non-IID settings. Specifically, we find that our method outperforms GBDT when the personalization target participant's data is small enough for GBDT's performance to be low. Moreover, pGBF outperforms GBF when the data distributions among the participants are non-IID.
ISSN: 2161-4407
DOI: 10.1109/IJCNN54540.2023.10191289