
Ensembles of Gradient Boosting Recurrent Neural Network for Time Series Data Prediction

Bibliographic Details
Published in: IEEE Access, 2024, p. 1-1
Main Authors: Sang, Shiqing; Qu, Fangfang; Nie, Pengcheng
Format: Article
Language:English
Description
Summary: Ensemble deep learning combines the strengths of neural networks and ensemble learning and is gradually becoming an emerging research direction. However, existing methods either lack theoretical support or require large integrated models. To address these problems, this paper proposes Ensembles of Gradient Boosting Recurrent Neural Network (EGB-RNN), which combines the gradient-boosting ensemble framework with three types of recurrent neural network models: Minimal Gated Unit (MGU), Gated Recurrent Unit (GRU), and Long Short-Term Memory (LSTM). An RNN model serves as the base learner and is integrated into an ensemble learner via gradient boosting. Meanwhile, to ensure the ensemble model fits the data better, a Step Iteration Algorithm is designed to find an appropriate learning rate before the models are integrated. Comparative experiments are carried out on four time-series data sets. The results demonstrate that as the number of integrated base learners increases, the performance of the three types of EGB-RNN models tends to converge, while the best EGB-RNN model and the best degree of ensembling vary across data sets. Statistical results also show that the designed EGB-RNN models outperform six baselines.
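The abstract describes the standard gradient-boosting recipe applied to sequence models: each new base learner is fit to the residuals (the negative gradient of squared loss) of the current ensemble, then added in with a learning rate. Below is a minimal sketch of that boosting loop. It is not the paper's implementation: a simple linear autoregressor stands in for the MGU/GRU/LSTM base learners, the fixed learning rate stands in for the paper's Step Iteration Algorithm, and all function names are illustrative.

```python
import numpy as np

class LinearBase:
    """Stand-in base learner (linear autoregression on lagged values).
    The paper uses MGU/GRU/LSTM recurrent networks here instead."""
    def fit(self, X, y):
        # least-squares fit with a bias column appended
        Xb = np.hstack([X, np.ones((len(X), 1))])
        self.w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
        return self

    def predict(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return Xb @ self.w

def fit_egb(X, y, n_stages=5, lr=0.3):
    """Gradient boosting with squared loss: stage t fits the residual y - F_t."""
    F = np.full(len(y), y.mean())    # initial constant model
    stages = []
    for _ in range(n_stages):
        resid = y - F                # negative gradient of 0.5*(y - F)^2
        h = LinearBase().fit(X, resid)
        F = F + lr * h.predict(X)    # shrunken additive update
        stages.append(h)
    return y.mean(), stages, lr

def predict_egb(model, X):
    base, stages, lr = model
    F = np.full(len(X), base)
    for h in stages:
        F = F + lr * h.predict(X)
    return F

# toy usage: predict the next value of a noisy sine wave from the last 4 values
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.05 * rng.standard_normal(400)
lag = 4
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:]
model = fit_egb(X, y, n_stages=10, lr=0.3)
mse = np.mean((predict_egb(model, X) - y) ** 2)
mse_mean = np.mean((y - y.mean()) ** 2)   # error of the constant baseline
```

Because squared loss makes the negative gradient equal to the residual, each stage reduces to fitting the current residuals; swapping `LinearBase` for a small trained RNN recovers the structure the abstract describes.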
ISSN: 2169-3536
DOI:10.1109/ACCESS.2021.3082519