Nonparametric Mixtures of Gaussian Processes With Power-Law Behavior
| Published in: | IEEE Transactions on Neural Networks and Learning Systems, Dec. 2012, Vol. 23 (12), pp. 1862–1871 |
|---|---|
| Main Authors: | Chatzis, Sotirios P.; Demiris, Yiannis |
| Format: | Article |
| Language: | English |
| Summary: | Gaussian processes (GPs) constitute one of the most important Bayesian machine learning approaches, based on a particularly effective method for placing a prior distribution over the space of regression functions. Several researchers have considered postulating mixtures of GPs as a means of dealing with nonstationary covariance functions, discontinuities, multimodality, and overlapping output signals. In existing works, mixtures of GPs are based on the introduction of a gating function defined over the space of model input variables. This way, each postulated mixture component GP is effectively restricted in a limited subset of the input space. In this paper, we follow a different approach. We consider a fully generative nonparametric Bayesian model with power-law behavior, generating GPs over the whole input space of the learned task. We provide an efficient algorithm for model inference, based on the variational Bayesian framework, and prove its efficacy using benchmark and real-world datasets. |
| ISSN: | 2162-237X; 2162-2388 |
| DOI: | 10.1109/TNNLS.2012.2217986 |
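
The abstract describes a fully generative nonparametric mixture of GPs whose prior exhibits power-law behavior, with each component GP defined over the whole input space rather than gated by the inputs. A natural reading is a Pitman-Yor process mixture, whose stick-breaking construction yields power-law component sizes; the sketch below illustrates that generative idea under this assumption. The kernel choice, the hyperparameters `alpha`, `d`, and `length_scale`, and the truncation level are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x1, x2, length_scale=0.3, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def pitman_yor_weights(alpha=1.0, d=0.5, truncation=20):
    """Truncated stick-breaking weights of a Pitman-Yor process.

    Discount 0 < d < 1 produces power-law behavior in component sizes,
    in contrast to the Dirichlet process (d = 0).
    """
    v = np.array([rng.beta(1.0 - d, alpha + (k + 1) * d)
                  for k in range(truncation)])
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# Draw mixture weights with power-law tail behavior.
weights = pitman_yor_weights()
weights /= weights.sum()  # renormalize mass lost to truncation

# Each component is a GP sample path over the WHOLE input space:
# there is no input-dependent gating function.
x = np.linspace(0.0, 1.0, 200)
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))  # jitter for stability
L = np.linalg.cholesky(K)
component_functions = [L @ rng.standard_normal(len(x))
                       for _ in range(len(weights))]

# Generate observations: each point picks a component by the mixture
# weights, independently of its location, then adds Gaussian noise.
z = rng.choice(len(weights), size=len(x), p=weights)
y = np.array([component_functions[z[i]][i] for i in range(len(x))])
y += 0.05 * rng.standard_normal(len(x))
```

The sketch only covers the generative side; the paper's contribution also includes a variational Bayesian inference algorithm for this model, which is not reproduced here.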