Sparse inverse kernel Gaussian Process regression
Published in: Statistical Analysis and Data Mining, 2013-06, Vol. 6 (3), pp. 205-220
Main Authors: ,
Format: Article
Language: English
Summary: Regression problems on massive data sets are ubiquitous in many application domains, including the Internet, earth and space sciences, and finance. Gaussian Process regression (GPR) is a popular technique for modeling the input–output relations of a set of variables under the assumption that the weight vector has a Gaussian prior. However, it is challenging to apply GPR to large data sets, since prediction based on the learned model requires the inversion of an order-n kernel matrix. Approximate solutions for sparse Gaussian Processes have been proposed for sparse problems, but in almost all cases these techniques are agnostic to the input domain and do not preserve the similarity structure in the data. As a result, although these solutions sometimes provide excellent accuracy, the resulting models are not interpretable. Such interpretable sparsity patterns are very important for many applications. We propose a new technique for sparse GPR that computes a parsimonious model while preserving the interpretability of the sparsity structure in the data. We discuss how the inverse kernel matrix used in Gaussian Process prediction conveys valuable domain information and then adapt inverse covariance estimation from Gaussian graphical models to estimate the Gaussian kernel. We solve the optimization problem using the alternating direction method of multipliers, which is amenable to parallel computation. We compare the performance of this algorithm with existing methods for sparse covariance regression in terms of both speed and accuracy, and we demonstrate the accuracy, scalability, and interpretability of our method on two satellite data sets from the climate domain. © 2013 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 6: 205–220, 2013
ISSN: 1932-1864, 1932-1872
DOI: 10.1002/sam.11189
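
The abstract names three concrete ingredients: an order-n kernel matrix whose inversion is the computational bottleneck, a sparse estimate of its inverse obtained by adapting inverse covariance estimation from Gaussian graphical models, and an ADMM solver. The following is a minimal NumPy sketch of that general pipeline under assumptions of my own (an RBF kernel, the standard graphical-lasso objective trace(S X) - log det X + lam * ||X||_1, and fixed lam, rho, and iteration counts); it illustrates the generic technique, not the authors' implementation or the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, noise=1e-3):
    """Dense RBF (Gaussian) kernel matrix with a small noise term on the diagonal."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-0.5 * d2 / lengthscale**2) + noise * np.eye(len(X))

def sparse_inverse_admm(S, lam=0.1, rho=1.0, iters=200):
    """ADMM for the graphical-lasso objective
        minimize  trace(S X) - log det X + lam * ||X||_1
    using the standard sparse inverse covariance selection splitting.
    Returns a sparse estimate of S^{-1}.
    """
    n = S.shape[0]
    Z = np.eye(n)
    U = np.zeros((n, n))
    for _ in range(iters):
        # X-update: closed form via eigendecomposition of rho*(Z - U) - S
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        x = (w + np.sqrt(w**2 + 4.0 * rho)) / (2.0 * rho)
        Xk = (Q * x) @ Q.T
        # Z-update: elementwise soft-thresholding enforces sparsity
        A = Xk + U
        Z = np.sign(A) * np.maximum(np.abs(A) - lam / rho, 0.0)
        # dual update
        U = U + Xk - Z
    return Z

# Toy usage: GP-style mean prediction with the sparse inverse kernel.
rng = np.random.default_rng(0)
Xtr = rng.uniform(-3, 3, size=(80, 1))
ytr = np.sin(Xtr[:, 0]) + 0.1 * rng.standard_normal(80)

K = rbf_kernel(Xtr)                     # order-n kernel matrix
K_inv_sparse = sparse_inverse_admm(K)   # sparse stand-in for K^{-1}
alpha = K_inv_sparse @ ytr              # prediction weights, roughly K^{-1} y

Xte = np.linspace(-3, 3, 5).reshape(-1, 1)
sq_tr = np.sum(Xtr**2, axis=1)
sq_te = np.sum(Xte**2, axis=1)
K_star = np.exp(-0.5 * (sq_te[:, None] + sq_tr[None, :] - 2.0 * Xte @ Xtr.T))
print("predicted mean:", K_star @ alpha)
print("nonzeros in estimated inverse:", np.count_nonzero(K_inv_sparse), "of", K.size)
```

The soft-thresholding in the Z-update is what produces exact zeros in the estimated inverse, giving the kind of sparsity pattern the abstract describes as interpretable, and both the eigendecomposition and the elementwise updates decompose naturally, which is the parallelism the abstract attributes to ADMM.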