Pseudo-Marginal Bayesian Inference for Gaussian Processes

@article{Filippone2014PseudoMarginalBI,
  title={Pseudo-Marginal Bayesian Inference for Gaussian Processes},
  author={Maurizio Filippone and Mark A. Girolami},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2014},
  volume={36},
  pages={2214--2226}
}
  • M. Filippone, M. Girolami
  • Published 2 October 2013
  • Computer Science
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
The main challenges that arise when adopting Gaussian process priors in probabilistic modeling are how to carry out exact Bayesian inference and how to account for uncertainty in model parameters when making model-based predictions on out-of-sample data. Using probit regression as an illustrative working example, this paper presents a general and effective methodology based on the pseudo-marginal approach to Markov chain Monte Carlo that efficiently addresses both of these issues. The results…
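The pseudo-marginal construction summarized above can be sketched in a few lines: replace the intractable marginal likelihood p(y | θ) in a Metropolis-Hastings acceptance ratio with an unbiased importance-sampling estimate, and recycle the estimate attached to the current state. Below is a minimal illustrative sketch on a toy 1-D probit-GP problem; the toy data, the flat prior on the log length-scale, and the use of the GP prior as importance proposal are all assumptions made for illustration, not the paper's exact setup:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy 1-D probit-GP data (purely illustrative).
n = 20
X = np.linspace(-3.0, 3.0, n)[:, None]
y = np.where(X[:, 0] + 0.3 * rng.standard_normal(n) > 0, 1.0, -1.0)  # labels in {-1, +1}

def kernel(X, log_ell):
    # Squared-exponential kernel with length-scale exp(log_ell), plus jitter.
    ell = np.exp(log_ell)
    d2 = (X - X.T) ** 2
    return np.exp(-0.5 * d2 / ell ** 2) + 1e-6 * np.eye(len(X))

def log_marginal_estimate(log_ell, n_imp=64):
    """Log of an unbiased importance-sampling estimate of p(y | theta),
    drawing latent functions from the GP prior as the proposal."""
    L = np.linalg.cholesky(kernel(X, log_ell))
    f = L @ rng.standard_normal((n, n_imp))            # n_imp prior draws of f
    log_lik = norm.logcdf(y[:, None] * f).sum(axis=0)  # probit likelihood per draw
    m = log_lik.max()                                  # log-mean-exp for stability
    return m + np.log(np.exp(log_lik - m).mean())

def pm_mh(n_iter=200, step=0.3):
    theta = 0.0                          # log length-scale; flat prior assumed
    log_z = log_marginal_estimate(theta)
    chain = []
    for _ in range(n_iter):
        theta_prop = theta + step * rng.standard_normal()
        log_z_prop = log_marginal_estimate(theta_prop)
        # Accept with the *estimated* marginal-likelihood ratio; because the
        # estimate is unbiased, the chain still targets the exact posterior.
        if np.log(rng.uniform()) < log_z_prop - log_z:
            theta, log_z = theta_prop, log_z_prop  # recycle the accepted estimate
        chain.append(theta)
    return np.array(chain)

chain = pm_mh()
```

The key detail is that the noisy estimate for the current state is stored and reused across iterations, never recomputed; this is what makes the scheme "exact-approximate" rather than merely approximate.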


Bayesian Inference for Gaussian Process Classifiers with Annealing and Pseudo-Marginal MCMC

  • M. Filippone
  • Computer Science
    2014 22nd International Conference on Pattern Recognition
  • 2014
The results empirically demonstrate that compared to importance sampling, annealed importance sampling can reduce the variance of the estimate of the marginal likelihood exponentially in the number of data at a computational cost that scales only polynomially.
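The variance claim above, that annealed importance sampling (AIS) gives far lower-variance marginal-likelihood estimates than plain importance sampling, can be illustrated on a deliberately simple 1-D toy target; the target, proposal, temperature schedule, and step size below are illustrative assumptions, not the paper's experiment:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy unnormalised target exp(-(x - 3)^2 / 2); true log-normaliser is 0.5*log(2*pi).
def log_target(x):
    return -0.5 * (x - 3.0) ** 2

def log_proposal(x):
    # Standard normal (normalised), also the AIS starting distribution.
    return -0.5 * x ** 2 - 0.5 * np.log(2.0 * np.pi)

def ais_log_weight(n_temps=50, n_mh=2, step=0.5):
    """One AIS run through intermediate densities
    p_b(x) ∝ proposal(x)^(1-b) * target(x)^b on a linear schedule in b."""
    betas = np.linspace(0.0, 1.0, n_temps)
    x = rng.standard_normal()
    logw = 0.0
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Weight update: ratio of consecutive intermediate densities at x.
        logw += (b1 - b0) * (log_target(x) - log_proposal(x))
        # A few Metropolis moves leaving p_{b1} invariant.
        for _ in range(n_mh):
            xp = x + step * rng.standard_normal()
            log_acc = ((1.0 - b1) * (log_proposal(xp) - log_proposal(x))
                       + b1 * (log_target(xp) - log_target(x)))
            if np.log(rng.uniform()) < log_acc:
                x = xp
    return logw

def is_log_weight():
    # Plain importance sampling with the same proposal, for comparison.
    x = rng.standard_normal()
    return log_target(x) - log_proposal(x)

ais_logw = np.array([ais_log_weight() for _ in range(100)])
is_logw = np.array([is_log_weight() for _ in range(100)])
```

On this toy problem the plain IS log-weights are widely dispersed because the proposal barely overlaps the target, while the annealed log-weights concentrate near the true log-normaliser 0.5·log(2π); the gap between the two variances grows quickly as the proposal-target mismatch increases, which is the mechanism behind the exponential-vs-polynomial claim above.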

Adaptive multiple importance sampling for Gaussian processes

This paper studies the application of AMIS for GPs in the case of a Gaussian likelihood, and proposes a novel pseudo-marginal-based AMIS algorithm for non-Gaussian likelihoods, where the marginal likelihood is unbiasedly estimated.

On MCMC for variationally sparse Gaussian processes: A pseudo-marginal approach

A pseudomarginal (PM) scheme is proposed that offers asymptotically exact inference as well as computational gains through doubly stochastic estimators for the intractable likelihood and large datasets.

Pseudo-marginal Bayesian inference for Gaussian process latent variable models

A Bayesian inference framework for supervised Gaussian process latent variable models is introduced. The framework overcomes the high correlations between latent variables and hyperparameters by…

Scalable Gaussian process inference using variational methods

This work decisively settles several theoretical issues arising from applying variational inference in the infinite-dimensional Gaussian process setting, and gives a new argument for existing variational regression approaches that resolves the debate about their applicability.

Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations

This work shows that, by revisiting old model approximations such as the fully-independent training conditionals and endowing them with powerful sampling-based inference methods, treating both inducing locations and GP hyper-parameters in a Bayesian way can significantly improve performance.

Marginalised Gaussian Processes with Nested Sampling

This work presents an alternative learning procedure where the hyperparameters of the kernel function are marginalised using Nested Sampling (NS), a technique well suited to sampling from complex, multi-modal distributions.

Deep Gaussian Processes for Calibration of Computer Models

This work proposes a novel calibration framework that is easy to implement in development environments featuring automatic differentiation and GPU-type hardware, and that yields a powerful alternative to the state of the art, as shown by experimental validations on various calibration problems.

Inference for log Gaussian Cox processes using an approximate marginal posterior

This paper estimates an approximate marginal posterior for the parameters of log Gaussian Cox processes and proposes a comprehensive model-inference strategy based on a pseudo-marginal Markov chain Monte Carlo approach.

Rethinking Sparse Gaussian Processes: Bayesian Approaches to Inducing-Variable Approximations

This work develops a fully Bayesian approach to scalable GP and deep GP models, and shows that treating both inducing locations and GP hyper-parameters in a Bayesian way, by inferring their full posterior, further significantly improves performance.
...
