Publications
Gaussian Processes for Machine Learning
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines.
The Infinite Gaussian Mixture Model
  • C. Rasmussen
  • Mathematics, Computer Science
  • NIPS
  • 29 November 1999
In a Bayesian mixture model it is not necessary to restrict the number of components to be finite a priori.
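The unbounded component count described here comes from a Dirichlet-process prior. A minimal sketch of its partition structure, the Chinese restaurant process, shows how the number of occupied components is left open rather than fixed in advance (the function name, concentration value, and seed below are illustrative choices, not from the paper):

```python
import random

def crp_assignments(n, alpha=1.0, seed=0):
    """Sample component assignments from a Chinese restaurant process.

    The CRP is the partition prior underlying Dirichlet-process mixtures:
    a new observation joins an existing component with probability
    proportional to its size, or opens a new one with weight alpha.
    """
    rng = random.Random(seed)
    counts = []       # observations per component
    assignments = []  # component index for each observation
    for i in range(n):
        r = rng.random() * (i + alpha)  # total mass = i existing + alpha new
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1          # join existing component k
                assignments.append(k)
                break
        else:
            counts.append(1)            # open a new component
            assignments.append(len(counts) - 1)
    return assignments, counts

labels, sizes = crp_assignments(100, alpha=2.0)
```

The number of occupied components in `sizes` grows slowly (roughly logarithmically) with the number of observations instead of being specified up front.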
A Unifying View of Sparse Approximate Gaussian Process Regression
We provide a new unifying view, including all existing proper probabilistic sparse approximations for Gaussian process regression.
PILCO: A Model-Based and Data-Efficient Approach to Policy Search
We introduce PILCO, a practical, data-efficient model-based policy search method for learning from scratch in only a few trials.
Gaussian Processes for Regression
We investigate the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations.
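The exact matrix computations referred to here can be sketched for a squared-exponential kernel; this is a minimal illustration with assumed hyperparameter and noise values, not the paper's implementation:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of scalar inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Exact GP posterior mean and variance via Cholesky-based solves."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha                       # posterior mean
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(v**2, axis=0)  # pointwise posterior variance
    return mean, var

# toy usage: recover a smooth function from noisy-free samples
x = np.linspace(0, 2 * np.pi, 8)
y = np.sin(x)
mu, var = gp_predict(x, y, np.array([1.0, 4.0]))
```

For fixed hyperparameters everything reduces to the linear algebra above; `mu` tracks the underlying function at the test inputs and `var` quantifies the remaining uncertainty.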
The Infinite Hidden Markov Model
We show that it is possible to extend hidden Markov models to have a countably infinite number of hidden states in a finite sequence by using the theory of Dirichlet processes.
Gaussian Processes for Data-Efficient Learning in Robotics and Control
We learn a probabilistic, non-parametric Gaussian process transition model of the system.
Evaluation of Gaussian Processes and Other Methods for Non-linear Regression
This thesis develops two Bayesian learning methods relying on Gaussian processes and a rigorous statistical approach for evaluating such methods.
Sparse Spectrum Gaussian Process Regression
We present a new sparse Gaussian Process (GP) model for regression that retains the computational efficiency of existing sparse approximations, while improving performance.
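As a rough illustration of the sparse-spectrum idea, an RBF kernel can be approximated by a finite set of random trigonometric basis functions, after which regression is linear in those features. The full method optimizes the spectral points; this sketch merely samples them, and every name and value here is an assumption for illustration:

```python
import numpy as np

def rff_features(x, n_features=200, lengthscale=1.0, seed=0):
    """Map scalar inputs to random cosine features whose inner products
    approximate an RBF kernel (a sparse-spectrum approximation)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 1.0 / lengthscale, size=n_features)  # spectral points
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)         # random phases
    return np.sqrt(2.0 / n_features) * np.cos(np.outer(x, w) + b)

def sparse_gp_predict(x_train, y_train, x_test, noise=1e-2, n_features=200):
    """Bayesian linear regression in the feature space: the cost scales
    with the number of features rather than the number of data points."""
    Phi = rff_features(x_train, n_features)
    A = Phi.T @ Phi + noise * np.eye(n_features)
    w_mean = np.linalg.solve(A, Phi.T @ y_train)   # posterior mean weights
    return rff_features(x_test, n_features) @ w_mean

# toy usage: fit sin on [0, 2*pi] and predict at one test point
x = np.linspace(0, 2 * np.pi, 40)
y = np.sin(x)
pred = sparse_gp_predict(x, y, np.array([1.0]))
```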