Robust Approximate Bayesian Inference With Synthetic Likelihood

@article{Frazier2021RobustAB,
  title={Robust Approximate Bayesian Inference With Synthetic Likelihood},
  author={David T. Frazier and Christopher C. Drovandi},
  journal={Journal of Computational and Graphical Statistics},
  year={2021},
  volume={30},
  pages={958--976}
}
Abstract
Bayesian synthetic likelihood (BSL) is now an established method for conducting approximate Bayesian inference in models where, due to the intractability of the likelihood function, exact Bayesian approaches are either infeasible or computationally too demanding. Implicit in the application of BSL is the assumption that the data-generating process (DGP) can produce simulated summary statistics that capture the behaviour of the observed summary statistics. We demonstrate that if this…
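To make the compatibility assumption concrete, here is a minimal sketch of the standard Gaussian synthetic likelihood that BSL builds on: simulate summaries under the model, fit a Gaussian, and score the observed summary. The names `simulate_summaries`, `theta`, and `s_obs` are hypothetical placeholders for a user-supplied simulator, a parameter value, and the observed summary vector.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gaussian_synthetic_loglik(theta, s_obs, simulate_summaries, m=200, rng=None):
    """Gaussian synthetic log-likelihood: simulate m summary vectors under
    theta, fit a multivariate normal by plug-in moments, and evaluate its
    log-density at the observed summary s_obs. If no parameter value lets
    the model reproduce s_obs (misspecification), this density collapses
    at s_obs, which is the failure mode the paper studies."""
    rng = np.random.default_rng(rng)
    S = np.asarray([simulate_summaries(theta, rng) for _ in range(m)])  # m x d
    mu = S.mean(axis=0)              # plug-in mean of the simulated summaries
    Sigma = np.cov(S, rowvar=False)  # plug-in covariance of the summaries
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)
```

In BSL this log-likelihood replaces the intractable one inside a standard MCMC loop; the robustness question is what happens to the resulting posterior when the simulated summaries cannot match the observed ones.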
Synthetic Likelihood in Misspecified Models: Consequences and Corrections
TLDR
Theoretical results demonstrate that in misspecified models the BSL posterior can display a wide range of behaviours depending on the level of model misspecification, including being asymptotically non-Gaussian.
BSL: An R Package for Efficient Parameter Estimation for Simulation-Based Models via Bayesian Synthetic Likelihood
TLDR
An R package called BSL is presented that amalgamates the aforementioned methods and more into a single, easy-to-use and coherent piece of software.
Bayesian inference using synthetic likelihood: asymptotics and adjustments
TLDR
It is shown that Bayesian synthetic likelihood is computationally more efficient than approximate Bayesian computation, and behaves similarly to regression-adjusted approximate Bayesian computation.
Efficient Bayesian Synthetic Likelihood With Whitening Transformations
TLDR
This article proposes whitening BSL (wBSL)—an efficient BSL method that uses approximate whitening transformations to decorrelate the summary statistics at each algorithm iteration, and shows empirically that this can reduce the number of model simulations required to implement BSL by more than an order of magnitude.
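As an illustration of the decorrelation step, the sketch below estimates a PCA-style whitening matrix from a pilot set of simulated summaries; the exact construction in wBSL (and how often the transformation is refreshed) may differ from this assumed version.

```python
import numpy as np

def whitening_matrix(S):
    """Estimate a PCA whitening matrix W from pilot simulated summaries
    S (m x d), so that z = W @ (s - mu) has approximately identity
    covariance. Decorrelated summaries let BSL use shrinkage covariance
    estimators that need far fewer model simulations per iteration."""
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False)
    eigval, eigvec = np.linalg.eigh(Sigma)    # Sigma = U diag(eigval) U^T
    W = np.diag(eigval ** -0.5) @ eigvec.T    # then W Sigma W^T = identity
    return W, mu
```

Applying the same map z = W @ (s - mu) to both observed and simulated summaries changes the summaries only by a fixed linear transformation while making their covariance close to diagonal.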
Transformations in Semi-Parametric Bayesian Synthetic Likelihood
TLDR
A number of extensions to semiBSL are proposed that significantly improve the versatility and efficiency of BSL algorithms, including even more flexible estimators of the marginal distributions based on transformation kernel density estimation.
Detecting Model Misspecification in Amortized Bayesian Inference with Neural Networks
TLDR
An augmented optimization objective is proposed that imposes a probabilistic structure on the learned latent data summary space and uses maximum mean discrepancy (MMD) to detect, during inference, potentially catastrophic misspecifications that undermine the validity of the obtained results.
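For concreteness, here is a sketch of the standard unbiased estimator of squared MMD with a Gaussian kernel, the quantity such detection criteria are built on; the paper's augmented training objective and summary network are not reproduced here, and the bandwidth choice is an assumption.

```python
import numpy as np

def mmd2_unbiased(X, Y, bandwidth=1.0):
    """Unbiased estimator of squared maximum mean discrepancy between
    samples X (n x d) and Y (m x d) under a Gaussian RBF kernel. A large
    value indicates the two distributions differ, which is the signal
    used to flag a potentially misspecified model at inference time."""
    def gram(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-sq / (2.0 * bandwidth ** 2))
    Kxx, Kyy, Kxy = gram(X, X), gram(Y, Y), gram(X, Y)
    n, m = len(X), len(Y)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))  # off-diagonal mean
    term_y = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))  # off-diagonal mean
    return term_x + term_y - 2.0 * Kxy.mean()
```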
Detecting conflicting summary statistics in likelihood-free inference
TLDR
Using a recent idea from the interpretable machine learning literature, some regression-based diagnostic methods are developed which are useful for detecting when different parts of a summary statistic vector contain conflicting information about the model parameters.
Robust Approximate Bayesian Computation: An Adjustment Approach
TLDR
A novel approach to approximate Bayesian computation (ABC) is developed that caters for possible misspecification of the assumed model and mitigates the poor performance of regression-adjusted ABC that can arise when the model is misspecified.
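For context on the baseline being corrected: "regression-adjusted ABC" usually refers to the Beaumont-style post-hoc linear adjustment sketched below. This is an assumed illustration of that baseline, not the paper's own robust adjustment; under misspecification s_obs can sit far outside the cloud of simulated summaries, and the linear extrapolation then behaves badly.

```python
import numpy as np

def linear_regression_adjust(theta, S, s_obs):
    """Beaumont-style adjustment of accepted ABC draws: regress accepted
    parameters theta (n x p) on accepted summaries S (n x d), then shift
    each draw as if its summary had equalled the observed one:
        theta_adj_i = theta_i - beta^T (s_i - s_obs)."""
    X = np.column_stack([np.ones(len(S)), S])         # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X, theta, rcond=None)  # (d+1) x p coefficients
    return theta - (S - s_obs) @ beta[1:]             # intercept row dropped
```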
Modularized Bayesian analyses and cutting feedback in likelihood-free inference
TLDR
A semi-modular approach to likelihood-free inference where feedback is partially cut based on Gaussian mixture approximations to the joint distribution of parameters and data summary statistics is developed.
On a Variational Approximation based Empirical Likelihood ABC Method
TLDR
This article shows that the target log-posterior can be approximated as a sum of an expected joint log-likelihood and the differential entropy of the data generating density, and proposes an easy-to-use empirical likelihood ABC method.
...

References

Robust Bayesian synthetic likelihood via a semi-parametric approach
TLDR
A semi-parametric approach is developed that relaxes the parametric assumption implicit in BSL while maintaining its computational advantages without any additional tuning; it can be significantly more robust than BSL and another approach in the literature.
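As a rough illustration of the semi-parametric construction, the sketch below pairs kernel density estimates of the marginals with a Gaussian copula fitted from normal scores; it is an assumed simplification, and the semiBSL estimator in the paper includes refinements not shown here.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

def semiparametric_loglik(s_obs, S):
    """Semi-parametric synthetic log-likelihood in the spirit of semiBSL:
    each marginal of the simulated summaries S (m x d) is modelled by a
    KDE, and the dependence by a Gaussian copula with a rank-based
    correlation matrix. Returns log copula density + sum of marginal logs."""
    m, d = S.shape
    log_marg, eta, Z = 0.0, np.empty(d), np.empty_like(S, dtype=float)
    for j in range(d):
        kde = gaussian_kde(S[:, j])
        log_marg += np.log(kde.evaluate(s_obs[j])[0])       # marginal log-density
        u = kde.integrate_box_1d(-np.inf, s_obs[j])         # KDE-based CDF
        eta[j] = norm.ppf(np.clip(u, 1e-6, 1 - 1e-6))       # normal score of s_obs
        ranks = np.argsort(np.argsort(S[:, j])) + 1
        Z[:, j] = norm.ppf(ranks / (m + 1))                 # scores of the draws
    R = np.corrcoef(Z, rowvar=False)                        # rank-based correlation
    _, logdet = np.linalg.slogdet(R)
    quad = eta @ (np.linalg.inv(R) - np.eye(d)) @ eta
    return -0.5 * (logdet + quad) + log_marg                # copula + marginals
```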
Bayesian inference using synthetic likelihood: asymptotics and adjustments
TLDR
It is shown that Bayesian synthetic likelihood is computationally more efficient than approximate Bayesian computation, and behaves similarly to regression-adjusted approximate Bayesian computation.
Bayesian Synthetic Likelihood
TLDR
The accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach is explored in comparison to a competitor known as approximate Bayesian computation (ABC) and its sensitivity to its tuning parameters and assumptions.
Model misspecification in approximate Bayesian computation: consequences and diagnostics
TLDR
The theoretical results demonstrate that even though the model is misspecified, under regularity conditions, the accept-reject ABC approach concentrates posterior mass on an appropriately defined pseudotrue parameter value, but under model misspecification the ABC posterior does not yield credible sets with valid frequentist coverage and has non-standard asymptotic behaviour.
Model Misspecification in ABC: Consequences and Diagnostics
TLDR
The theoretical results demonstrate that even though the model is misspecified, under regularity conditions, the accept/reject ABC approach concentrates posterior mass on an appropriately defined pseudo-true parameter value.
Approximate Bayesian Computation: A Nonparametric Perspective
Approximate Bayesian Computation is a family of likelihood-free inference techniques that are well suited to models defined in terms of a stochastic generating mechanism. In a nutshell, Approximate…
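To spell out the generating-mechanism view, here is a minimal accept-reject ABC sketch; `prior_sample` and `simulate_summaries` are hypothetical user-supplied callables, and the acceptance rule (keeping the best fraction in Euclidean distance) is an assumption.

```python
import numpy as np

def rejection_abc(s_obs, prior_sample, simulate_summaries, n=10_000,
                  quantile=0.01, rng=None):
    """Basic accept-reject ABC: draw parameters from the prior, push each
    through the stochastic generating mechanism, and keep the draws whose
    simulated summaries land closest to the observed summary s_obs."""
    rng = np.random.default_rng(rng)
    thetas = np.asarray([prior_sample(rng) for _ in range(n)])
    S = np.asarray([simulate_summaries(t, rng) for t in thetas])
    dist = np.linalg.norm(S - s_obs, axis=1)          # discrepancy to s_obs
    return thetas[dist <= np.quantile(dist, quantile)]
```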
Robust Approximate Bayesian Computation: An Adjustment Approach
TLDR
A novel approach to approximate Bayesian computation (ABC) is developed that caters for possible misspecification of the assumed model and mitigates the poor performance of regression-adjusted ABC that can arise when the model is misspecified.
Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models
TLDR
A strategy is proposed that combines probabilistic modeling of the discrepancy with optimization to facilitate likelihood-free inference, and it is shown to accelerate inference by reducing the number of required simulations by several orders of magnitude.
Relevant statistics for Bayesian model choice
TLDR
This work derives necessary and sufficient conditions on summary statistics for the corresponding Bayes factor to be convergent, namely, to select the true model asymptotically between the two models under comparison.
A general framework for updating belief distributions
TLDR
It is argued that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case.
...