Pseudo-Marginal Bayesian Inference for Gaussian Processes
@article{Filippone2013PseudoMarginalBI,
  title   = {Pseudo-Marginal Bayesian Inference for Gaussian Processes},
  author  = {Maurizio Filippone and Mark A. Girolami},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year    = {2013},
  volume  = {36},
  pages   = {2214-2226}
}
The main challenges that arise when adopting Gaussian process priors in probabilistic modeling are how to carry out exact Bayesian inference and how to account for uncertainty in model parameters when making model-based predictions on out-of-sample data. Using probit regression as an illustrative working example, this paper presents a general and effective methodology based on the pseudo-marginal approach to Markov chain Monte Carlo that efficiently addresses both of these issues. The results…
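As a rough illustration of the approach described in the abstract, the sketch below implements pseudo-marginal Metropolis-Hastings over GP covariance hyperparameters for probit classification, replacing the intractable marginal likelihood with an unbiased importance-sampling estimate. Everything here is an assumption for illustration: the RBF kernel, the GP prior as importance distribution (the paper discusses far lower-variance estimators), the flat prior on log-hyperparameters, and all function names. This is not the authors' code.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.stats import norm


def rbf_kernel(X, lengthscale, variance):
    """Squared-exponential covariance with a small jitter for stable Cholesky."""
    d2 = cdist(X, X, "sqeuclidean")
    return variance * np.exp(-0.5 * d2 / lengthscale**2) + 1e-8 * np.eye(len(X))


def is_log_marginal(theta, X, y, n_imp=64, rng=None):
    """Unbiased importance-sampling estimate of p(y | theta), on the log scale.

    The GP prior serves as the importance distribution -- a deliberately crude
    choice kept for brevity. y is assumed to take values in {-1, +1}.
    """
    rng = np.random.default_rng() if rng is None else rng
    K = rbf_kernel(X, theta[0], theta[1])
    L = np.linalg.cholesky(K)
    f = L @ rng.standard_normal((len(X), n_imp))      # latent draws from the prior
    loglik = norm.logcdf(y[:, None] * f).sum(axis=0)  # probit likelihood per draw
    m = loglik.max()                                  # log-mean-exp for stability
    return m + np.log(np.exp(loglik - m).mean())


def pseudo_marginal_mh(X, y, theta0, n_iter=2000, step=0.1, seed=0):
    """Pseudo-marginal Metropolis-Hastings over log-hyperparameters.

    A flat (improper) prior on the log-hyperparameters is assumed for brevity.
    """
    rng = np.random.default_rng(seed)
    log_theta = np.log(np.asarray(theta0, dtype=float))
    log_z = is_log_marginal(np.exp(log_theta), X, y, rng=rng)
    samples = []
    for _ in range(n_iter):
        prop = log_theta + step * rng.standard_normal(log_theta.shape)
        log_z_prop = is_log_marginal(np.exp(prop), X, y, rng=rng)
        # the estimate log_z for the current state is carried over, never
        # refreshed: reusing it is what keeps the noisy chain exactly correct
        if np.log(rng.uniform()) < log_z_prop - log_z:
            log_theta, log_z = prop, log_z_prop
        samples.append(np.exp(log_theta))
    return np.array(samples)
```

The one property the sketch must preserve is that the likelihood estimate for the current state is reused rather than recomputed at every iteration; because the estimator is unbiased, the chain then targets the exact hyperparameter posterior despite the noise in each estimate.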
62 Citations
Bayesian Inference for Gaussian Process Classifiers with Annealing and Pseudo-Marginal MCMC
- Computer Science · 2014 22nd International Conference on Pattern Recognition
- 2014
The results empirically demonstrate that, compared to importance sampling, annealed importance sampling can reduce the variance of the marginal-likelihood estimate exponentially in the number of data points, at a computational cost that scales only polynomially.
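To make the variance-reduction claim above concrete, here is a minimal, hypothetical sketch of annealed importance sampling for the same probit GP marginal likelihood, using a preconditioned Crank-Nicolson Metropolis step as the transition kernel at each temperature. The annealing path, the kernel, and all names are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np
from scipy.stats import norm


def ais_log_marginal(L, y, n_temps=50, n_chains=16, step=0.3, seed=0):
    """Annealed importance sampling estimate of log p(y) for a probit GP.

    L is the Cholesky factor of the prior covariance K; the annealing path is
    p_t(f) proportional to N(f; 0, K) * p(y | f)^beta_t. One Metropolis step
    per temperature keeps the sketch short; more steps per rung cut variance.
    y is assumed to take values in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    n = L.shape[0]
    betas = np.linspace(0.0, 1.0, n_temps)
    loglik = lambda f: norm.logcdf(y[:, None] * f).sum(axis=0)
    f = L @ rng.standard_normal((n, n_chains))  # exact draws from the beta = 0 rung
    ll = loglik(f)
    logw = np.zeros(n_chains)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * ll               # incremental AIS weight
        # preconditioned Crank-Nicolson proposal: it leaves the GP prior
        # invariant, so the Metropolis correction involves only the tempered
        # likelihood ratio
        prop = np.sqrt(1.0 - step**2) * f + step * (L @ rng.standard_normal((n, n_chains)))
        ll_prop = loglik(prop)
        accept = np.log(rng.uniform(size=n_chains)) < b * (ll_prop - ll)
        f[:, accept] = prop[:, accept]
        ll[accept] = ll_prop[accept]
    m = logw.max()                              # log-mean-exp over the chains
    return m + np.log(np.exp(logw - m).mean())
```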
On MCMC for variationally sparse Gaussian processes: A pseudo-marginal approach
- Computer Science
- 2021
A pseudo-marginal (PM) scheme is proposed that offers asymptotically exact inference as well as computational gains on large datasets through doubly stochastic estimators of the intractable likelihood.
Pseudo-marginal Bayesian inference for Gaussian process latent variable models
- Computer Science · Machine Learning
- 2021
A Bayesian inference framework for supervised Gaussian process latent variable models is introduced. The framework overcomes the high correlations between latent variables and hyperparameters by…
Scalable Gaussian process inference using variational methods
- Computer Science
- 2017
Various theoretical issues arising from the application of variational inference to the infinite-dimensional Gaussian process setting are settled decisively, and a new argument is given for existing approaches to variational regression that settles debate about their applicability.
Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations
- Computer Science · AISTATS
- 2021
This work shows that, by revisiting old model approximations such as the fully independent training conditionals and endowing them with powerful sampling-based inference methods, treating both inducing locations and GP hyperparameters in a Bayesian way can improve performance significantly.
Marginalised Gaussian Processes with Nested Sampling
- Computer Science · NeurIPS
- 2021
This work presents an alternative learning procedure where the hyperparameters of the kernel function are marginalised using Nested Sampling (NS), a technique that is well suited to sample from complex, multi-modal distributions.
Inference for log Gaussian Cox processes using an approximate marginal posterior
- Computer Science, Mathematics
- 2016
This paper estimates an approximate marginal posterior for the parameters of log Gaussian Cox processes and proposes a comprehensive model-inference strategy based on a pseudo-marginal Markov chain Monte Carlo approach.
Sparse Gaussian Processes Revisited
- Computer Science
- 2021
This work develops a fully Bayesian approach to scalable GP and deep GP models, and demonstrates its state-of-the-art performance through an extensive experimental campaign across several regression and classification problems.
Approximated Information Analysis in Bayesian Inference
- Mathematics · Entropy
- 2015
The approximate sensitivity of the posterior distribution of interest is studied in terms of information measures, including the Kullback–Leibler divergence, and Gibbs sensitivity is explored by using an alternative to the full conditional distribution of the nuisance parameter.
Sparse Gaussian Process Hyperparameters: Optimize or Integrate?
- Computer Science · ArXiv
- 2022
This work proposes an algorithm for sparse Gaussian process regression that leverages MCMC to sample from the hyperparameter posterior within the variational inducing-point framework of [Titsias, 2009], sidestepping the need to sample the inducing points and thereby improving sampling efficiency in the Gaussian-likelihood case.
References
Showing 1-10 of 58 references
Bayesian Inference for Gaussian Process Classifiers with Annealing and Pseudo-Marginal MCMC
- Computer Science · 2014 22nd International Conference on Pattern Recognition
- 2014
The results empirically demonstrate that, compared to importance sampling, annealed importance sampling can reduce the variance of the marginal-likelihood estimate exponentially in the number of data points, at a computational cost that scales only polynomially.
Bayesian Inference for Gaussian Process Classifiers with Annealing and Exact-Approximate MCMC
- Computer Science
- 2013
It is empirically demonstrated that annealed importance sampling reduces the variance of the marginal-likelihood estimate exponentially in the number of data points compared to importance sampling, while the computational cost scales only polynomially.
A comparative evaluation of stochastic-based inference methods for Gaussian process models
- Computer Science · Machine Learning
- 2013
A number of inference strategies based on Markov chain Monte Carlo methods are presented, rigorously assessed, and extensively compared on simulated and real data on the basis of convergence speed, sampling efficiency, and computational cost.
Marginal likelihood estimation via power posteriors
- Mathematics
- 2008
It is shown how the marginal likelihood can be computed via Markov chain Monte Carlo methods on modified posterior distributions for each model, which then allows Bayes factors or posterior model probabilities to be calculated.
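For reference, a worked statement of the identity this entry relies on (the standard power-posterior, or thermodynamic-integration, formula; $\theta$ and $y$ are generic symbols introduced here): defining the power posterior $p_t(\theta \mid y) \propto p(y \mid \theta)^{t}\, p(\theta)$ for $t \in [0, 1]$,

$$\log p(y) \;=\; \int_0^1 \mathbb{E}_{p_t(\theta \mid y)}\big[\log p(y \mid \theta)\big]\, \mathrm{d}t,$$

so the marginal likelihood follows from running MCMC on a ladder of modified posteriors $0 = t_0 < t_1 < \dots < t_m = 1$ and applying, for example, trapezoidal quadrature to the estimated expectations.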
On the Fully Bayesian Treatment of Latent Gaussian Models using Stochastic Simulations
- Computer Science
- 2012
Stochastic simulations based on Markov chain Monte Carlo (MCMC) methods are proposed for small to moderately sized data sets and for LGMs whose parameter sets prevent the use of quadrature techniques.
Assessing Approximate Inference for Binary Gaussian Process Classification
- Computer Science · J. Mach. Learn. Res.
- 2005
This work reviews and compares Laplace's method and Expectation Propagation for approximate Bayesian inference in the binary Gaussian process classification model, and presents a comprehensive comparison of the approximations, their predictive performance and marginal likelihood estimates to results obtained by MCMC sampling.
Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations
- Computer Science
- 2009
This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, in which the latent field is Gaussian, controlled by a few hyperparameters, and the response variables are non-Gaussian; very accurate approximations to the posterior marginals can be computed directly.
Regression and Classification Using Gaussian Process Priors
- Computer Science
- 2009
Gaussian processes are in my view the simplest and most obvious way of defining flexible Bayesian regression and classification models, but despite some past usage, they appear to have been rather neglected as a general-purpose technique.
INLA or MCMC? A tutorial and comparative evaluation for spatial prediction in log-Gaussian Cox processes
- Computer Science
- 2012
The results question the notion that INLA is both significantly faster and more robust than MCMC in this setting; 100,000 iterations of the MALA algorithm running in 20 min on a desktop PC delivered greater predictive accuracy than the default INLA strategy and gave comparable performance to the full Laplace approximation, which ran in 39 min.
Inference from Iterative Simulation Using Multiple Sequences
- Computer Science
- 1992
The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.