
Approximate Bayesian inference from noisy likelihoods with Gaussian process emulated MCMC

@inproceedings{Jarvenpaa2021ApproximateBI,
  title={Approximate Bayesian inference from noisy likelihoods with Gaussian process emulated MCMC},
  author={Marko Jarvenpaa and Jukka Corander},
  year={2021}
}
We present an efficient approach for doing approximate Bayesian inference when only a limited number of noisy likelihood evaluations can be obtained due to computational constraints, which is becoming increasingly common for applications of complex models. Our main methodological innovation is to model the log-likelihood function using a Gaussian process (GP) in a local fashion and apply this model to emulate the progression that an exact Metropolis-Hastings (MH) algorithm would take if it was…
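
To make the idea concrete, here is a minimal sketch, under simplifying assumptions that are not the authors' algorithm: a GP is fitted once to a handful of noisy log-likelihood evaluations of a toy model, and a Metropolis-Hastings chain is then run on the GP predictive mean plus a log-prior. The paper's method additionally models the log-likelihood locally along the MH trajectory and accounts for GP uncertainty when deciding acceptance and acquiring new evaluations; none of that is shown here, and the toy target and all names are illustrative.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def noisy_loglik(theta, sigma_n=0.5):
    # Stand-in for an expensive simulator-based log-likelihood with evaluation noise.
    return -0.5 * theta**2 + sigma_n * rng.standard_normal()

def log_prior(theta):
    # Broad Gaussian prior.
    return -0.5 * (theta / 10.0) ** 2

# Fit a GP emulator to noisy log-likelihood evaluations on a coarse grid.
thetas = np.linspace(-4.0, 4.0, 15)
ys = np.array([noisy_loglik(t) for t in thetas])
gp = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=1.0),
    alpha=0.5 ** 2,  # diagonal term matching the assumed evaluation-noise variance
)
gp.fit(thetas.reshape(-1, 1), ys)

# Metropolis-Hastings driven by the GP predictive mean of the log-likelihood.
theta, chain = 0.0, []
for _ in range(2000):
    prop = theta + 0.8 * rng.standard_normal()
    m_curr, m_prop = gp.predict(np.array([[theta], [prop]]))
    log_alpha = (m_prop + log_prior(prop)) - (m_curr + log_prior(theta))
    if np.log(rng.uniform()) < log_alpha:
        theta = prop
    chain.append(theta)

print("posterior mean estimate:", np.mean(chain[500:]))
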
Citations

A survey of Monte Carlo methods for noisy and costly densities with application to reinforcement learning
This survey gives an overview of Monte Carlo methodologies that use surrogate models to deal with densities that are intractable, costly, and/or noisy, and presents a modular scheme encompassing the methods considered.

References

Showing 1–10 of 71 references
Bayesian Synthetic Likelihood
Having the ability to work with complex models can be highly beneficial. However, complex models often have intractable likelihoods, so methods that involve evaluation of the likelihood…
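
For context, a minimal sketch of the synthetic-likelihood idea under simplifying assumptions (the simulator simulate_summaries is hypothetical, and practical implementations typically use unbiased or shrinkage covariance estimators): simulate summary statistics at a parameter value, fit a Gaussian to them, and evaluate the observed summaries under that Gaussian.

import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, s_obs, simulate_summaries, n_sim=100):
    # Simulate n_sim summary-statistic vectors at parameter value theta.
    S = np.array([simulate_summaries(theta) for _ in range(n_sim)])
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False)
    # Gaussian approximation to the sampling distribution of the summaries.
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)
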
The Bayesian Choice
This paperback edition, a reprint of the 2001 edition, is a graduate-level textbook that introduces Bayesian statistics and decision theory and was awarded the 2004 DeGroot Prize for setting a new standard for modern textbooks dealing with Bayesian methods.
A table of normal integrals
Integrals of functions of the univariate, bivariate, trivariate and multivariate normal densities are given. Both indefinite and definite integrals are included.
Tables for Computing Bivariate Normal Probabilities
Batch simulations and uncertainty quantification in Gaussian process surrogate approximate Bayesian computation
Batch-sequential Bayesian experimental design strategies to parallelise the expensive simulations are proposed, together with a numerical method to fully quantify the uncertainty in, for example, ABC posterior moments.
Variational Bayesian Monte Carlo
A novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC), combines variational inference with Gaussian-process-based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective.
On Markov chain Monte Carlo methods for tall data
An original subsampling-based approach is proposed which samples from a distribution provably close to the posterior distribution of interest, yet can require fewer than $O(n)$ data-point likelihood evaluations at each iteration for certain statistical models in favourable scenarios.
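
As a generic illustration only (not that paper's specific scheme, whose contribution is controlling the error such estimates introduce into the MH accept/reject step), a naive unbiased minibatch estimate of a full-data log-likelihood for i.i.d. data looks like this; the function names are hypothetical.

import numpy as np

def subsampled_loglik(theta, data, loglik_one, batch_size, rng):
    # Draw a random minibatch without replacement.
    idx = rng.choice(len(data), size=batch_size, replace=False)
    # Rescale so the estimate is unbiased for the full-data log-likelihood sum.
    return len(data) / batch_size * sum(loglik_one(theta, data[i]) for i in idx)
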
A general framework for updating belief distributions
It is argued that a valid update of a prior belief distribution to a posterior can be made for parameters that are connected to observations through a loss function rather than the traditional likelihood function, with the standard likelihood-based update recovered as a special case.
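
Schematically (a simplified rendering, not the paper's exact notation), the loss-based update takes the form

\pi(\theta \mid x) \propto \exp\{-\ell(\theta, x)\}\, \pi(\theta),

where \ell is the chosen loss; taking \ell to be the negative log-likelihood recovers standard Bayesian updating, and a learning-rate (temperature) factor on the loss is often included.
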
GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation
This work develops two new ABC sampling algorithms that significantly reduce the number of simulations necessary for posterior inference, storing the information obtained from every simulation in a Gaussian process that acts as a surrogate for the simulated statistics.
Likelihood ridges and multimodality in population growth rate models
Using the theta-Ricker model as a simple but flexible description of density dependence, theory and simulations are applied to show how multimodality and ridges in the likelihood surface can emerge even in the absence of model misspecification or observation error.
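
For reference (a common parameterisation, not necessarily the one used in that paper), the deterministic theta-Ricker map is

N_{t+1} = N_t \exp\!\left\{ r \left[ 1 - \left( N_t / K \right)^{\theta} \right] \right\},

where r is the intrinsic growth rate, K the carrying capacity, and \theta shapes the density dependence; stochastic variants add process noise.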