
Publication

Stacked Gaussian Process Learning

Marion Neumann; Kristian Kersting; Zhao Xu; Daniel Schulz
In: Wei Wang; Hillol Kargupta; Sanjay Ranka; Philip S. Yu; Xindong Wu (eds.). ICDM 2009, The Ninth IEEE International Conference on Data Mining (ICDM-2009), December 6-9, 2009, Miami, Florida, USA, pages 387-396, IEEE Computer Society, 2009.

Abstract

Triggered by a market-relevant application that involves making joint predictions of pedestrian and public transit flows in urban areas, we address the question of how to exploit hidden common cause relations among variables of interest in order to improve performance on two related regression tasks. Specifically, we propose stacked Gaussian process learning, a meta-learning scheme in which a base Gaussian process is enhanced by adding the posterior covariance functions of other related tasks to its covariance function in a stage-wise optimization. The idea is that the stacked posterior covariances encode the hidden common causes among variables of interest that are shared across the related regression tasks. Stacked Gaussian process learning is efficient, capable of capturing shared common causes, and can be implemented with any kind of standard Gaussian process regression model, including sparse approximations and relational variants. Our experimental results on real-world data from the market-relevant application show that stacked Gaussian process learning can significantly improve the prediction performance of a standard Gaussian process.
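The stacking step described above can be illustrated with a minimal numpy sketch: fit a standard GP to a related task, then add that task's posterior covariance matrix to the base covariance of the target-task GP before predicting. This is an illustrative simplification, not the paper's implementation; it assumes both tasks share the same inputs, and all data, kernel choices, and function names here are hypothetical.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    """Squared-exponential base covariance (illustrative choice)."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise=0.1):
    """Standard GP regression: posterior mean and covariance at X_test."""
    K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test)
    Kss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, cov

# Toy data: two related tasks observed at the same inputs (purely synthetic).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(30, 1))
y_related = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
y_target = np.sin(X[:, 0]) + 0.3 * np.cos(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(30)

# Stage 1: fit a base GP to the related task; its posterior covariance is
# taken to encode hidden common causes shared with the target task.
_, post_cov_related = gp_posterior(X, y_related, X)

# Stage 2: stack -- add the related task's posterior covariance to the base
# covariance of the target-task GP, then predict as usual.
noise = 0.1
K_stacked = rbf_kernel(X, X) + post_cov_related
mean_stacked = K_stacked @ np.linalg.solve(
    K_stacked + noise**2 * np.eye(len(X)), y_target)
```

Because the stacked term is itself a valid (positive semi-definite) covariance matrix, the sum remains a valid covariance, so any standard GP machinery can be reused unchanged on top of it.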
