Classification using hierarchical mixture of discriminative learners: How to achieve high scores with few resources?
Published in: Expert Systems with Applications, 2018-04, Vol. 96, pp. 14-24
Main Authors: ,
Format: Article
Language: English
Summary:
• We design a generalized framework for hierarchical mixture of experts.
• Several forms of weighting functions can be combined in the same architecture.
• A suitable choice of the weight functions will reduce the number of experts used.
• This reduction is shown through two examples derived from the generalized framework.
In many real-world classification tasks, discriminative models are widely applied because they achieve good predictive scores. In this paper, we propose a generalized framework for the well-studied hierarchical mixture of experts (HME) model, which hierarchically combines several discriminative models through a set of input-dependent weights. From this generalized framework we derive two models as examples of how the number of experts used can be reduced; both examples rest on the choice of the weighting functions. We choose Gaussian-based and linear-softmax weighting functions, restricting our study to a two-level tree. Experiments on synthetic and real-world datasets show that our models can efficiently reduce the number of experts and outperform some state-of-the-art algorithms.
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2017.11.046
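
The record contains no code; the following is a minimal, hypothetical sketch (NumPy) of the kind of architecture the abstract describes: a two-level tree in which a linear-softmax gate produces input-dependent weights that mix several logistic-regression experts, one of the two gating choices the authors mention. The class name TwoLevelHME, its parameters (n_experts, lr, n_iter), and the plain gradient-ascent training loop are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: two-level mixture of experts with a linear-softmax
# gate. Not the paper's code; names, shapes, and the optimizer are assumptions.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TwoLevelHME:
    """Two-level tree: a linear-softmax gate mixes K logistic-regression experts."""

    def __init__(self, n_features, n_experts=2, lr=0.1, n_iter=500, seed=0):
        rng = np.random.default_rng(seed)
        self.V = rng.normal(scale=0.1, size=(n_features + 1, n_experts))  # gate weights
        self.W = rng.normal(scale=0.1, size=(n_features + 1, n_experts))  # expert weights
        self.lr, self.n_iter = lr, n_iter

    @staticmethod
    def _add_bias(X):
        return np.hstack([X, np.ones((X.shape[0], 1))])

    def predict_proba(self, X):
        Xb = self._add_bias(X)
        gate = softmax(Xb @ self.V)          # (n, K) input-dependent mixing weights
        experts = sigmoid(Xb @ self.W)       # (n, K) per-expert P(y = 1 | x)
        return (gate * experts).sum(axis=1)  # mixture prediction

    def fit(self, X, y):
        Xb = self._add_bias(X)
        y = y.reshape(-1, 1)
        for _ in range(self.n_iter):
            gate = softmax(Xb @ self.V)
            experts = sigmoid(Xb @ self.W)
            p = (gate * experts).sum(axis=1, keepdims=True)
            # Gradient of the Bernoulli log-likelihood of the mixture output,
            # pushed through the experts and the softmax gate by the chain rule.
            dL_dp = (y - p) / np.clip(p * (1.0 - p), 1e-9, None)
            grad_experts = dL_dp * gate * experts * (1.0 - experts)
            grad_gate = dL_dp * gate * (experts - p)
            self.W += self.lr * Xb.T @ grad_experts / len(y)
            self.V += self.lr * Xb.T @ grad_gate / len(y)
        return self

# Tiny usage example on an XOR-like synthetic problem (purely illustrative).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)
model = TwoLevelHME(n_features=2, n_experts=4, lr=0.5, n_iter=2000).fit(X, y)
print("training accuracy:", ((model.predict_proba(X) > 0.5) == y).mean())
```

A Gaussian-based gate in the spirit of the abstract's other choice could be sketched by replacing the softmax over linear scores with normalized Gaussian kernels centered in input space; the paper's actual formulation and optimization procedure may differ.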