Evolutionary extreme learning machine ensembles with size control
Published in: Neurocomputing (Amsterdam), 2013-02, Vol. 102, pp. 98-110
Main Authors:
Format: Article
Language: English
Summary: Ensemble learning aims to improve the generalization power and the reliability of learner models through sampling and optimization techniques. It has been shown that an ensemble constructed from a selective collection of base learners compares favorably. However, an effective way to build such an ensemble from a given learner pool is still an open problem. This paper presents an evolutionary approach for constructing extreme learning machine (ELM) ensembles. The proposed algorithm employs model diversity as the fitness function to direct the selection of base learners, and produces an optimal solution with ensemble size control. A comprehensive comparison is carried out, in which the basic ELM is used to generate a set of neural networks and 12 benchmark regression datasets are employed in simulations. The reported results demonstrate that the proposed method outperforms other ensembling techniques, including simple averaging, bagging, and AdaBoost, in terms of both effectiveness and efficiency.
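The abstract outlines the recipe only at a high level: generate a pool of basic ELMs, then run an evolutionary search over membership bit-strings, with a diversity-based fitness and a mechanism that keeps the ensemble small. The following is a minimal illustrative sketch of that idea, not the paper's algorithm: the toy data, the `train_elm` and `fitness` helpers, the ambiguity-style diversity term, and the size-penalty weight are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical; stands in for one of the benchmark sets).
X = rng.uniform(-1, 1, size=(300, 5))
y = np.sin(X.sum(axis=1)) + 0.1 * rng.standard_normal(300)
X_train, y_train = X[:200], y[:200]
X_val, y_val = X[200:], y[200:]

def train_elm(X, y, hidden=30):
    """Basic ELM: random input weights/biases, least-squares output weights."""
    W = rng.standard_normal((X.shape[1], hidden))
    b = rng.standard_normal(hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Pool of base ELMs and their validation predictions.
POOL = 40
models = [train_elm(X_train, y_train) for _ in range(POOL)]
val_preds = np.array([predict(m, X_val) for m in models])   # shape (POOL, n_val)

def fitness(mask):
    """Diversity-driven fitness with a size penalty (assumed form, not the paper's)."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return -np.inf
    preds = val_preds[idx]
    ens = preds.mean(axis=0)
    diversity = np.mean((preds - ens) ** 2)   # spread of members around the ensemble
    error = np.mean((ens - y_val) ** 2)       # ensemble validation error
    size_penalty = 0.01 * idx.size            # encourages compact ensembles
    return diversity - error - size_penalty

# Simple genetic algorithm over membership bit-strings.
POP, GENS = 30, 50
pop = rng.integers(0, 2, size=(POP, POOL))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][:POP // 2]]       # keep the fitter half
    children = []
    for _ in range(POP - parents.shape[0]):
        pa, pb = parents[rng.integers(parents.shape[0], size=2)]
        cut = rng.integers(1, POOL)                           # one-point crossover
        child = np.concatenate([pa[:cut], pb[cut:]])
        flip = rng.random(POOL) < 0.05                        # bit-flip mutation
        child[flip] ^= 1
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
chosen = np.flatnonzero(best)
ens_pred = val_preds[chosen].mean(axis=0)
print(f"selected {chosen.size} of {POOL} ELMs, val MSE = {np.mean((ens_pred - y_val) ** 2):.4f}")
```

In this sketch the size-penalty term is what provides the "size control": increasing its weight trades a little validation accuracy for a smaller selected ensemble.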
ISSN: 0925-2312; 1872-8286
DOI: 10.1016/j.neucom.2011.12.046