Efficient Acquisition Rules for Model-Based Approximate Bayesian Computation

@article{Jarvenpaa2019EfficientAR,
  title={Efficient Acquisition Rules for Model-Based Approximate Bayesian Computation},
  author={Marko J{\"a}rvenp{\"a}{\"a} and Michael U. Gutmann and Arijus Pleska and Aki Vehtari and Pekka Marttinen},
  journal={Bayesian Analysis},
  year={2019}
}
Approximate Bayesian computation (ABC) is a method for Bayesian inference when the likelihood is unavailable but simulating from the model is possible. However, many ABC algorithms require a large number of simulations, which can be costly. To reduce the computational cost, Bayesian optimisation (BO) and surrogate models such as Gaussian processes have been proposed. Bayesian optimisation enables one to intelligently decide where to evaluate the model next, but common BO strategies are not…
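For orientation, the following toy sketch illustrates the rejection-ABC baseline that the paper's acquisition rules aim to improve on. The Gaussian simulator, uniform prior, and tolerance `eps` are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    """Toy black-box model: n draws from N(theta, 1)."""
    return rng.normal(theta, 1.0, size=n)

def discrepancy(x, x_obs):
    """Distance between summary statistics (here, sample means)."""
    return abs(x.mean() - x_obs.mean())

x_obs = simulator(2.0)                        # stand-in for observed data
eps = 0.1                                     # ABC tolerance
prior_draws = rng.uniform(-5, 5, size=20000)  # theta ~ U(-5, 5)

# Rejection ABC: keep thetas whose simulated data lie within eps of
# the observation; every proposal, kept or not, costs one simulation.
accepted = [t for t in prior_draws
            if discrepancy(simulator(t), x_obs) < eps]
print(f"{len(accepted)} accepted of {len(prior_draws)} simulations")
```

Every proposed theta costs a full simulator run; GP surrogates and BO acquisition rules reduce this cost by modelling the discrepancy and choosing the next evaluation point adaptively instead of drawing it blindly from the prior.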

Citations

Parallel Gaussian process surrogate method to accelerate likelihood-free inference
TLDR
Motivated by recent progress in batch Bayesian optimisation, this work develops batch-sequential strategies in which multiple simulations are adaptively selected to minimise either the expected or the median loss function measuring the uncertainty in the resulting posterior.
No-regret approximate inference via Bayesian optimisation
TLDR
An upper confidence bound (UCB) algorithm is derived to propose non-parametric distribution candidates and achieves asymptotically no regret in Bayesian inference problems where the likelihood function is either expensive to evaluate or only available via noisy estimates.
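As a rough, generic illustration of a UCB-type rule in this surrogate setting (a sketch of the general idea, not the specific algorithm of the cited paper): given GP predictions over candidate parameters, the next simulation location minimises a lower confidence bound that trades off the predicted mean against uncertainty.

```python
import numpy as np

def lcb_acquisition(mu, sigma, beta=2.0):
    """Lower confidence bound for minimisation: prefer candidates with
    a low predicted mean or a high predictive uncertainty."""
    return mu - beta * sigma

# Given GP predictions (mu, sigma) over a grid of candidate thetas,
# pick where to run the simulator next (values here are made up).
mu = np.array([1.2, 0.8, 1.5, 0.9])
sigma = np.array([0.1, 0.3, 0.5, 0.05])
next_idx = int(np.argmin(lcb_acquisition(mu, sigma)))
print("evaluate candidate", next_idx)
```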
Flexible And Efficient Simulation-Based Inference For Models Of Decision-Making
TLDR
Mixed Neural Likelihood Estimation (MNLE) trains neural density estimators on model simulations to emulate the simulator; the emulator can then be used to perform Bayesian parameter inference on experimental data using standard approximate inference methods such as Markov chain Monte Carlo sampling.
Likelihood-free inference with emulator networks
TLDR
This work presents a new ABC method which uses probabilistic neural emulator networks to learn synthetic likelihoods on simulated data, training both local emulators that approximate the likelihood for specific observed data and global ones that are applicable to a range of data.
Measuring the accuracy of likelihood-free inference
TLDR
This work argues for scoring algorithms by the mean squared error in estimating expectations of functions with respect to the posterior, and shows that sequential Monte Carlo in this context can be made more accurate, with no new samples, by accepting particles from all rounds.
Likelihood-Free Inference by Ratio Estimation
TLDR
An alternative inference approach is presented that is as easy to use as synthetic likelihood but less restricted in its assumptions and that, in a natural way, enables automatic selection of relevant summary statistics from a large set of candidates.
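The core ratio-estimation trick can be sketched with an off-the-shelf classifier: train logistic regression to discriminate data simulated at theta from marginal (prior-predictive) data, and read an approximate log-likelihood ratio off the classifier's logit. A minimal sketch, with the simulator and prior assumed for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

theta = np.array([1.0, -0.5])
x_theta = rng.normal(theta, 1.0, size=(500, 2))   # draws from p(x | theta)

thetas_prior = rng.uniform(-2, 2, size=(500, 2))  # theta ~ prior
x_marg = rng.normal(thetas_prior, 1.0)            # draws from the marginal p(x)

# Label 1 = simulated at theta, label 0 = marginal; with balanced
# classes the classifier's logit approximates log p(x|theta) - log p(x).
X = np.vstack([x_theta, x_marg])
y = np.concatenate([np.ones(500), np.zeros(500)])
clf = LogisticRegression().fit(X, y)

x_obs = np.array([[0.8, -0.3]])
log_ratio = clf.decision_function(x_obs)  # approximate log-likelihood ratio
print(log_ratio)
```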
A review of approximate Bayesian computation methods via density estimation: Inference for simulator‐models
TLDR
Advantages and limitations of parametric approaches are discussed, and attention is drawn to developments in machine learning that have the potential to make ABC scalable to higher dimensions and may be the future direction for research in this area.
Automatic Posterior Transformation for Likelihood-Free Inference
TLDR
Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.
Adaptive MCMC for synthetic likelihoods and correlated synthetic likelihoods.
TLDR
A novel adaptive MCMC algorithm for synthetic likelihood (SL) is introduced in which the proposal distribution is sequentially tuned, and existing strategies from the correlated particle filter literature are exploited to improve MCMC mixing in the SL framework.
Bayesian optimization for likelihood-free cosmological inference
TLDR
This work addresses the problem of performing likelihood-free Bayesian inference from black-box simulation-based models, under the constraint of a very limited simulation budget, and adopts an approach based on the likelihood of an alternative parametric model.

References

SHOWING 1-10 OF 67 REFERENCES
Approximate Bayesian computation via regression density estimation
TLDR
The contribution of the present paper is to consider regression density estimation techniques to approximate the likelihood in the ABC setting, building on recently developed marginal adaptation density estimators by extending them to conditional density estimation.
Approximate Bayesian computation (ABC) gives exact results under the assumption of model error
R. Wilkinson · Statistical Applications in Genetics and Molecular Biology · 2013
TLDR
Under the assumption of a uniform additive model error term, ABC algorithms give exact results when sufficient summaries are used, which allows the approximation made in many previous application papers to be understood and should guide the choice of metric and tolerance in future work.
Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation
TLDR
This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.
Approximate Bayesian computation scheme for parameter inference and model selection in dynamical systems
TLDR
This paper discusses and applies an ABC method based on sequential Monte Carlo (SMC) to estimate parameters of dynamical models and develops ABC SMC as a tool for model selection; given a range of different mathematical descriptions, it is able to choose the best model using the standard Bayesian model selection apparatus.
Constructing summary statistics for approximate Bayesian computation: semi‐automatic approximate Bayesian computation
TLDR
This work shows how to construct appropriate summary statistics for ABC in a semi‐automatic manner, and shows that optimal summary statistics are the posterior means of the parameters.
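The semi-automatic construction is simple to sketch: run a pilot set of simulations, regress the parameters on data features, and use the fitted regression's predictions as summary statistics (approximating posterior means). A minimal sketch with an assumed toy simulator:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Pilot run: simulate (theta, data-feature) pairs from the prior.
thetas = rng.uniform(0, 5, size=(2000, 1))
features = np.column_stack([
    rng.normal(thetas[:, 0], 1.0),        # e.g. a sample mean
    rng.normal(thetas[:, 0] ** 2, 2.0),   # e.g. a nonlinear feature
])

# Fit E[theta | features]; the fitted predictions then serve as the
# summary statistic s(x) inside a subsequent ABC run.
reg = LinearRegression().fit(features, thetas)
s = lambda x_features: reg.predict(np.atleast_2d(x_features))
print(s([2.0, 4.5]))
```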
Accelerating ABC methods using Gaussian processes
TLDR
Gaussian process (GP) accelerated ABC is introduced, which it is shown can significantly reduce the number of simulations required and enable more accurate inference in some models.
GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation
TLDR
This work develops two new ABC sampling algorithms that significantly reduce the number of simulations necessary for posterior inference by storing the information obtained from every simulation in a Gaussian process, which acts as a surrogate function for the simulated statistics.
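The surrogate idea can be illustrated with an off-the-shelf GP: fit it to (theta, discrepancy) pairs from past simulations and query its cheap posterior instead of re-running the simulator. A toy sketch using scikit-learn, with the kernel and data chosen purely for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(3)

# Past simulations: parameter values and the observed discrepancies.
thetas = rng.uniform(-3, 3, size=(30, 1))
disc = (thetas[:, 0] - 1.0) ** 2 + rng.normal(0, 0.1, size=30)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              alpha=0.1**2, normalize_y=True)
gp.fit(thetas, disc)

# The GP posterior stands in for the simulator: mean and uncertainty
# of the discrepancy at unseen thetas, at negligible cost.
grid = np.linspace(-3, 3, 5).reshape(-1, 1)
mu, sd = gp.predict(grid, return_std=True)
print(np.c_[grid, mu, sd])
```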
Gaussian Processes to Speed up Hybrid Monte Carlo for Expensive Bayesian Integrals
TLDR
This work proposes to use a Gaussian Process model of the (log of the) posterior for most of the computations required by HMC, allowing Bayesian treatment of models with posteriors that are computationally demanding, such as models involving computer simulation.
Bayesian Synthetic Likelihood
TLDR
The accuracy and computational efficiency of the Bayesian synthetic likelihood (BSL) approach are explored in comparison to a competitor known as approximate Bayesian computation (ABC), along with BSL's sensitivity to its tuning parameters and assumptions.
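For orientation, the synthetic-likelihood density that BSL targets can be estimated in a few lines: simulate summary statistics at theta, fit a Gaussian to them, and evaluate the observed summary under that Gaussian. A minimal sketch with an assumed toy simulator:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)

def summaries(theta, n_sims=200):
    """Toy simulator returning 2-D summary statistics."""
    return rng.normal([theta, theta**2], 1.0, size=(n_sims, 2))

def log_synthetic_likelihood(theta, s_obs):
    """Gaussian approximation to the summary distribution at theta."""
    s = summaries(theta)
    mu = s.mean(axis=0)
    cov = np.cov(s, rowvar=False)
    return multivariate_normal(mu, cov).logpdf(s_obs)

s_obs = np.array([1.1, 1.3])
print(log_synthetic_likelihood(1.0, s_obs))
```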