Deep Learning for Multi-Output Regression Using Gradient Boosting
Published in: IEEE Access, 2024, Vol. 12, pp. 17760-17772
Main Authors:
Format: Article
Language: English
Subjects:
Summary: This paper presents a novel methodology for multi-output regression that combines deep neural networks with gradient boosting. The approach uses dense layers as the additive models within the gradient boosting framework via an auto transfer learning technique. At each boosting iteration, the deep model is cloned with its already trained layers frozen, and a new dense layer is concatenated to the frozen ones. Only the weights of the newly added layer are then trained, which reduces the complexity of the learning task. Each layer is fit to the residuals of the squared loss from the previous iterations, yielding a robust, sequentially trained deep neural network ensemble. Experimental results show that the proposed approach significantly improves the performance of the deep framework, producing more accurate predictions and improved model interpretability.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3359115
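
The record gives no implementation details, so the following Python (Keras) sketch is only an illustration of the stage-wise scheme the summary describes: at each boosting iteration the previously trained dense layers are reused with their weights frozen, a new dense layer and output head are stacked on top, and only those new weights are fit to the squared-loss residuals from the earlier iterations. Layer sizes, the optimizer, epoch counts, and all function names here are assumptions, not the authors' code.

```python
# Illustrative sketch only; hyperparameters and names are assumed, not from the paper.
import numpy as np
from tensorflow import keras


def fit_boosted_deep_regressor(X, y, n_stages=5, hidden_units=64, epochs=50):
    """Stage-wise training: freeze earlier layers, fit a new layer to residuals."""
    n_outputs = y.shape[1]
    residuals = y.astype("float32")
    frozen_layers = []   # dense layers trained in earlier boosting iterations
    stage_models = []    # one Keras model per boosting iteration

    for it in range(n_stages):
        inputs = keras.Input(shape=(X.shape[1],))
        h = inputs
        # Re-apply the already trained layers with their weights frozen,
        # mimicking the "clone with frozen layers" step described in the summary.
        for layer in frozen_layers:
            layer.trainable = False
            h = layer(h)
        # New dense layer concatenated to the frozen stack; only it (and a
        # fresh linear output head) is trained in this iteration.
        new_layer = keras.layers.Dense(hidden_units, activation="relu",
                                       name=f"boost_layer_{it}")
        head = keras.layers.Dense(n_outputs, name=f"head_{it}")
        outputs = head(new_layer(h))

        model = keras.Model(inputs, outputs)
        model.compile(optimizer="adam", loss="mse")  # squared loss
        model.fit(X, residuals, epochs=epochs, verbose=0)

        # Gradient boosting update: subtract this stage's fit from the residuals.
        residuals -= model.predict(X, verbose=0)

        frozen_layers.append(new_layer)
        stage_models.append(model)

    return stage_models


def predict_boosted(stage_models, X):
    """Additive ensemble prediction: sum the stage-wise residual fits."""
    return sum(m.predict(X, verbose=0) for m in stage_models)


# Example usage on random data (shapes are arbitrary, not from the paper):
X = np.random.rand(256, 10).astype("float32")
y = np.random.rand(256, 3).astype("float32")
models = fit_boosted_deep_regressor(X, y, n_stages=3)
y_hat = predict_boosted(models, X)
```

Because each stage reuses the frozen layers and only trains the newly appended one, every iteration optimizes a small number of weights while the ensemble prediction remains the additive sum of the stage outputs, as in standard gradient boosting.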