Concept Drift Adaptation by Exploiting Historical Knowledge

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2018-10, Vol. 29 (10), pp. 4822-4832
Main Authors: Sun, Yu; Tang, Ke; Zhu, Zexuan; Yao, Xin
Format: Article
Language: English
Description
Summary: Incremental learning with concept drift has often been tackled by ensemble methods, where models built in the past can be retrained to attain new models for the current data. Two design questions need to be addressed in developing ensemble methods for incremental learning with concept drift, i.e., which historical (i.e., previously trained) models should be preserved and how to utilize them. A novel ensemble learning method, namely, Diversity and Transfer-based Ensemble Learning (DTEL), is proposed in this paper. Given newly arrived data, DTEL uses each preserved historical model as an initial model and further trains it with the new data via transfer learning. Furthermore, DTEL preserves a diverse set of historical models, rather than a set of historical models that are merely accurate in terms of classification accuracy. Empirical studies on 15 synthetic data streams and 5 real-world data streams (all with concept drifts) demonstrate that DTEL can handle concept drift more effectively than 4 other state-of-the-art methods.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2017.2775225
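
The summary above highlights two design choices: adapting each preserved historical model to a newly arrived data chunk by continued training (transfer), and preserving models for their diversity rather than only their past accuracy. The sketch below illustrates that general scheme only; it is not the authors' DTEL implementation, and the class name HistoricalTransferEnsemble, the disagreement-based pruning heuristic, and the use of scikit-learn's SGDClassifier with partial_fit as the base learner are assumptions made for illustration.

# Minimal illustrative sketch (an assumption, not the authors' DTEL code):
# an ensemble that keeps a pool of historical models, adapts each one to a
# newly arrived data chunk by continued training, and preserves a diverse
# pool via a simple prediction-disagreement heuristic.
import copy

import numpy as np
from sklearn.linear_model import SGDClassifier


class HistoricalTransferEnsemble:
    def __init__(self, max_pool_size=5, classes=None):
        self.max_pool_size = max_pool_size
        self.classes = classes        # all class labels, required by partial_fit
        self.pool = []                # preserved historical models
        self.ensemble = []            # models voting on the current concept

    def update(self, X_new, y_new):
        """Adapt the ensemble to a newly arrived data chunk."""
        # "Transfer": continue training a copy of every preserved model on the
        # new chunk, so past knowledge serves as the starting point.
        adapted = [copy.deepcopy(m) for m in self.pool]
        for m in adapted:
            m.partial_fit(X_new, y_new)
        # Also train a fresh model from scratch on the new chunk.
        fresh = SGDClassifier()
        fresh.partial_fit(X_new, y_new, classes=self.classes)
        adapted.append(fresh)
        self.ensemble = adapted
        # Preserve a diverse pool: greedily keep the models whose predictions
        # on the new chunk disagree most with the models already kept.
        kept = [fresh]
        candidates = [m for m in adapted if m is not fresh]
        while candidates and len(kept) < self.max_pool_size:
            def disagreement(model):
                preds = model.predict(X_new)
                return np.mean([np.mean(preds != k.predict(X_new)) for k in kept])
            best = max(candidates, key=disagreement)
            kept.append(best)
            candidates.remove(best)
        self.pool = kept

    def predict(self, X):
        # Majority vote; assumes non-negative integer class labels.
        votes = np.stack([m.predict(X) for m in self.ensemble]).astype(int)
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)


# Usage sketch on a hypothetical chunked stream (prequential evaluation):
#   ens = HistoricalTransferEnsemble(max_pool_size=5, classes=np.array([0, 1]))
#   for X_chunk, y_chunk in chunk_stream:          # hypothetical generator
#       if ens.ensemble:
#           accuracy = np.mean(ens.predict(X_chunk) == y_chunk)
#       ens.update(X_chunk, y_chunk)

The greedy disagreement-based pruning is only a stand-in for the paper's diversity criterion; the point it illustrates is the design choice emphasized in the abstract, namely preserving historical models for their diversity rather than purely for their accuracy.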