Robust kernel ensemble regression in diversified kernel space with shared parameters
Published in: Applied Intelligence (Dordrecht, Netherlands), 2023, Vol. 53 (1), p. 1051-1067
Main Authors: , , , ,
Format: Article
Language: English
Summary: Kernel regression is an effective non-parametric regression method, but such methods depend on the choice of an appropriate kernel and its parameters. In this paper, we propose a robust kernel ensemble regression model (RKER) in diversified multiple Reproducing Kernel Hilbert Spaces (RKHSs). Motivated by multi-view data processing, we treat each kernel representation as one view of the data and apply this multi-view modeling idea to the kernel regression scenario. The proposed RKER combines multiple individual kernel regressors into a single ensemble, where each regressor is associated with a weight learned directly from its view of the data without manual intervention. The problem of selecting a kernel and its parameters in traditional kernel regression is thus overcome by finding the best kernel combination across diversified solution spaces. With this multi-view modeling, RKER achieves superior overall regression performance and is more robust to parameter selection. Further, the parameters in the multiple RKHSs can be learned with both individual-specific and shared structures. Experimental results on the Abalone and FaceBook datasets demonstrate that the proposed RKER model outperforms state-of-the-art regression and ensemble methods such as Random Forest, Gradient Boosting Regressor, and eXtreme Gradient Boosting.
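The core idea in the summary, an ensemble of kernel regressors where each kernel acts as one "view" and receives a learned weight, can be illustrated with a minimal sketch. The code below is an assumption-laden simplification, not the paper's RKER optimization: it fits one kernel ridge regressor per RBF bandwidth and weights them by a softmax over their training errors, whereas the paper learns weights and RKHS parameters jointly with shared structure. The class name, bandwidths, and weighting rule are all illustrative choices.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Pairwise squared distances -> RBF Gram matrix exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelEnsembleRegressor:
    """Sketch of a kernel ensemble: one kernel ridge regressor per RBF
    bandwidth ("view"), combined with weights derived from each
    regressor's fit. A simplification of RKER, not the paper's method."""

    def __init__(self, gammas=(0.1, 1.0, 10.0), ridge=1e-2):
        self.gammas = gammas
        self.ridge = ridge

    def fit(self, X, y):
        self.X_ = X
        self.alphas_ = []
        errs = []
        n = len(X)
        for g in self.gammas:
            K = rbf_kernel(X, X, g)
            # Kernel ridge solution: alpha = (K + ridge * I)^{-1} y
            alpha = np.linalg.solve(K + self.ridge * np.eye(n), y)
            self.alphas_.append(alpha)
            errs.append(np.mean((K @ alpha - y) ** 2))
        # Softmax over negative errors: better-fitting kernels get more weight,
        # so no single kernel or bandwidth must be chosen by hand
        w = np.exp(-np.array(errs))
        self.weights_ = w / w.sum()
        return self

    def predict(self, X):
        preds = [rbf_kernel(X, self.X_, g) @ a
                 for g, a in zip(self.gammas, self.alphas_)]
        return np.einsum('k,kn->n', self.weights_, np.stack(preds))
```

Weighting by in-sample fit is the simplest stand-in for the learned per-view weights described above; a held-out validation split (or the joint robust objective of the paper) would be the more principled choice.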
ISSN: 0924-669X
eISSN: 1573-7497
DOI: 10.1007/s10489-022-03492-6