Robust Bayesian synthetic likelihood via a semi-parametric approach

@article{An2020RobustBS,
  title={Robust Bayesian synthetic likelihood via a semi-parametric approach},
  author={Ziwen An and David J. Nott and Christopher C. Drovandi},
  journal={Statistics and Computing},
  year={2020},
  volume={30},
  pages={543--557}
}
Bayesian synthetic likelihood (BSL) is now a well-established method for performing approximate Bayesian parameter estimation for simulation-based models that do not possess a tractable likelihood function. BSL approximates an intractable likelihood function of a carefully chosen summary statistic at a parameter value with a multivariate normal distribution. The mean and covariance matrix of this normal distribution are estimated from independent simulations of the model. Due to the parametric… 
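To make the construction concrete, here is a minimal illustrative sketch (not code from the paper) of the standard Gaussian synthetic log-likelihood in Python, assuming a user-supplied function simulate(theta, rng) that returns a d-dimensional vector of summary statistics:

import numpy as np
from scipy.stats import multivariate_normal

def synthetic_log_likelihood(theta, s_obs, simulate, n_sim=200, rng=None):
    """Standard Gaussian synthetic log-likelihood estimate at theta.

    s_obs    : observed summary statistic vector, shape (d,)
    simulate : assumed user-supplied function (theta, rng) -> summary vector
    n_sim    : number of independent model simulations
    """
    rng = np.random.default_rng() if rng is None else rng
    # n_sim independent simulations of the model at theta
    S = np.array([simulate(theta, rng) for _ in range(n_sim)])
    mu = S.mean(axis=0)              # estimated mean of the summaries
    Sigma = np.cov(S, rowvar=False)  # estimated covariance of the summaries
    # Multivariate normal approximation to the summary-statistic likelihood
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)

Plugging this noisy estimate into a Metropolis-Hastings acceptance ratio then gives the usual BSL posterior sampler.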
Transformations in Semi-Parametric Bayesian Synthetic Likelihood
TLDR
A number of extensions to semiBSL are proposed that significantly improve the versatility and efficiency of BSL algorithms, including even more flexible estimators of the marginal distributions based on transformation kernel density estimation.
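A rough sketch of the semi-parametric idea follows: KDE marginals tied together by a Gaussian copula. The transformation-KDE refinements mentioned above are omitted, and the rank-based probability transform below is a simplification of the KDE-based one used in semiBSL itself:

import numpy as np
from scipy.stats import gaussian_kde, norm

def semiparametric_log_likelihood(s_obs, S):
    """Simplified semi-parametric synthetic log-likelihood:
    KDE marginals combined with a Gaussian copula for dependence.

    s_obs : observed summaries, shape (d,)
    S     : simulated summaries at theta, shape (n_sim, d)
    """
    n, d = S.shape
    log_marg = 0.0
    u = np.empty(d)
    for j in range(d):
        kde = gaussian_kde(S[:, j])              # marginal KDE
        log_marg += np.log(kde(s_obs[j])[0])
        # rank-based probability transform of the observed summary
        u[j] = (np.sum(S[:, j] <= s_obs[j]) + 0.5) / (n + 1)
    z = norm.ppf(u)                              # Gaussian scores
    # Copula correlation estimated from normal scores of the simulations
    ranks = (np.argsort(np.argsort(S, axis=0), axis=0) + 0.5) / n
    R = np.corrcoef(norm.ppf(ranks), rowvar=False)
    Rinv = np.linalg.inv(R)
    _, logdet = np.linalg.slogdet(R)
    # Gaussian copula log-density plus the marginal log-densities
    log_copula = -0.5 * (logdet + z @ (Rinv - np.eye(d)) @ z)
    return log_marg + log_copula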
Efficient Bayesian Synthetic Likelihood With Whitening Transformations
TLDR
This article proposes whitening BSL (wBSL)—an efficient BSL method that uses approximate whitening transformations to decorrelate the summary statistics at each algorithm iteration, and shows empirically that this can reduce the number of model simulations required to implement BSL by more than an order of magnitude.
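A minimal sketch of the idea, assuming the whitening matrix is estimated once from pilot simulations at a central parameter value (the shrinkage covariance estimation used in the actual wBSL method is omitted):

import numpy as np
from scipy.stats import norm

def whitening_matrix(S_pilot):
    """ZCA-style whitening matrix from pilot simulations, shape (n, d):
    W maps summaries to approximately decorrelated coordinates."""
    Sigma = np.cov(S_pilot, rowvar=False)
    eigval, eigvec = np.linalg.eigh(Sigma)
    return eigvec @ np.diag(eigval ** -0.5) @ eigvec.T

def whitened_log_likelihood(theta, s_obs, simulate, W, n_sim=50, rng=None):
    """Gaussian synthetic log-likelihood with a diagonal covariance on
    whitened summaries, so far fewer simulations are needed."""
    rng = np.random.default_rng() if rng is None else rng
    S = np.array([simulate(theta, rng) for _ in range(n_sim)])
    Z = S @ W.T                      # whiten the simulated summaries
    z_obs = W @ s_obs                # whiten the observed summaries
    mu = Z.mean(axis=0)
    sd = Z.std(axis=0, ddof=1)
    # Independent-normal likelihood in the whitened coordinates
    return norm.logpdf(z_obs, loc=mu, scale=sd).sum()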
Bayesian Synthetic Likelihood and Compatibility
TLDR
To circumvent the issue of incompatibility between the observed and simulated summary statistics, two robust versions of BSL are proposed that can deliver reliable performance regardless of whether or not the assumed DGP can generate simulated summary statistics that mimic the behavior of the observed summaries.
Robust Approximate Bayesian Inference With Synthetic Likelihood
TLDR
This work proposes a new BSL approach that can detect the presence of model misspecification and simultaneously deliver useful inferences even under significant model misspecification, demonstrating superior accuracy over standard BSL when the assumed model is misspecified.
Bayesian inference using synthetic likelihood: asymptotics and adjustments
TLDR
It is shown that Bayesian synthetic likelihood is computationally more efficient than approximate Bayesian computation, and behaves similarly to regression-adjusted approximate Bayesian computation.
On a Variational Approximation based Empirical Likelihood ABC Method
TLDR
This article shows that the target log-posterior can be approximated as a sum of an expected joint log-likelihood and the differential entropy of the data generating density, and proposes an easy-to-use empirical likelihood ABC method.
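Paraphrasing that decomposition in symbols (a reconstruction from the sentence above, not a formula quoted from the paper; g denotes the data-generating density and h(g) its differential entropy):

\log \pi(\theta \mid y_{\mathrm{obs}}) \;\approx\; \mathbb{E}_{y \sim g}\big[\log p(y, \theta)\big] + h(g) + \mathrm{const}, \qquad h(g) = -\int g(y)\,\log g(y)\,\mathrm{d}y .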
Approximate Bayesian inference from noisy likelihoods with Gaussian process emulated MCMC
TLDR
The main methodological innovation is to model the log-likelihood function using a Gaussian process in a local fashion and to apply this model to emulate the progression that an exact Metropolis-Hastings algorithm would take if it were applicable.
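A minimal sketch of the surrogate step using scikit-learn (the paper's local modelling and uncertainty-aware acceptance decisions are not reproduced here; the data below are assumed to come from earlier noisy log-likelihood evaluations):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_loglik_emulator(thetas, loglik):
    """GP surrogate of a noisy log-likelihood surface.

    thetas : parameter values with noisy log-likelihood estimates, (n, p)
    loglik : the corresponding estimates, (n,)
    The WhiteKernel term absorbs the Monte Carlo noise in the estimates.
    """
    kernel = 1.0 * RBF(length_scale=np.ones(thetas.shape[1])) \
             + WhiteKernel(noise_level=1.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(thetas, loglik)
    return gp

# Inside a Metropolis-Hastings step, the emulated log-likelihood (and its
# uncertainty) stands in for the exact, intractable one:
# mean, sd = gp.predict(theta_prop.reshape(1, -1), return_std=True)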
Parallel Gaussian Process Surrogate Bayesian Inference with Noisy Likelihood Evaluations
TLDR
This work frames the inference task as a sequential Bayesian experimental design problem, where the log-likelihood function is modelled with a hierarchical Gaussian process (GP) surrogate model, which is used to efficiently select additional log-likelihood evaluation locations.
Score Matched Conditional Exponential Families for Likelihood-Free Inference
TLDR
This work generates parameter-simulation pairs from the model independently of the observation and uses them to learn a conditional exponential family approximation to the likelihood, which can also be used to provide summaries for ABC; the method is applied to a challenging model from meteorology.
Sequentially Guided MCMC Proposals for Synthetic Likelihoods and Correlated Synthetic Likelihoods
TLDR
This work introduces an algorithm that produces a proposal distribution that is sequentially tuned and conditioned on the data, so that it rapidly guides the proposed parameters towards high-posterior-density regions, and exploits strategies borrowed from the correlated pseudo-marginal MCMC literature to improve chain mixing in an SL framework.

References

Showing 1-10 of 40 references
Accelerating Bayesian Synthetic Likelihood With the Graphical Lasso
TLDR
The nontrivial issue of tuning-parameter choice in the context of BSL is discussed, and the graphical lasso is proposed to obtain a sparse estimate of the precision matrix, yielding significant improvements in computational efficiency whilst still producing posterior distributions similar to those of standard BSL.
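A hedged sketch of how a graphical-lasso precision estimate slots into the synthetic likelihood, using scikit-learn (the penalty alpha below is exactly the kind of tuning parameter whose choice the paper discusses):

import numpy as np
from scipy.stats import multivariate_normal
from sklearn.covariance import graphical_lasso

def glasso_log_likelihood(s_obs, S, alpha=0.1):
    """Gaussian synthetic log-likelihood with a graphical-lasso
    (sparse precision) estimate of the summary covariance.

    S     : simulated summaries at theta, shape (n_sim, d)
    alpha : l1 penalty; larger values give a sparser precision matrix
    """
    mu = S.mean(axis=0)
    emp_cov = np.cov(S, rowvar=False)
    # graphical_lasso returns the regularised covariance and its inverse
    cov_, prec_ = graphical_lasso(emp_cov, alpha=alpha)
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=cov_)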
Bayesian Synthetic Likelihood
TLDR
The accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach are explored in comparison to a competitor known as approximate Bayesian computation (ABC), along with BSL's sensitivity to its tuning parameters and assumptions.
Variational Bayes with synthetic likelihood
TLDR
This article develops alternatives to Markov chain Monte Carlo implementations of Bayesian synthetic likelihood with reduced computational overhead, using stochastic gradient variational inference methods for posterior approximation in the synthetic likelihood context and employing unbiased estimates of the log-likelihood.
Likelihood-Free Inference by Ratio Estimation
TLDR
An alternative inference approach is presented that is as easy to use as synthetic likelihood but not as restricted in its assumptions, and that, in a natural way, enables automatic selection of relevant summary statistics from a large set of candidates.
An Extended Empirical Saddlepoint Approximation for Intractable Likelihoods
TLDR
A novel, more flexible density estimator is proposed: the Extended Empirical Saddlepoint approximation, which can capture large departures from normality while scaling to high dimensions, leading in turn to more accurate parameter estimates relative to the Gaussian alternative.
Constructing summary statistics for approximate Bayesian computation: semi‐automatic approximate Bayesian computation
TLDR
This work shows how to construct appropriate summary statistics for ABC in a semi-automatic manner, and establishes that the optimal summary statistics are the posterior means of the parameters.
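The construction admits a very short sketch (illustrative; the pilot ABC run used in the paper to restrict the regression region is omitted here):

import numpy as np
from sklearn.linear_model import LinearRegression

def fit_summary_regression(pilot_features, pilot_thetas):
    """Semi-automatic summaries: regress parameters on (functions of)
    simulated pilot data, so fitted values estimate posterior means.

    pilot_features : shape (n_pilot, k), features of simulated datasets
    pilot_thetas   : shape (n_pilot, p), parameters that generated them
    """
    return LinearRegression().fit(pilot_features, pilot_thetas)

# summary(x) = reg.predict(features(x)) is then the p-dimensional summary
# statistic used inside a standard ABC algorithm.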
Bootstrapped synthetic likelihood
TLDR
The use of the bag of little bootstraps is investigated as a means of applying this approach to large datasets, yielding Monte Carlo algorithms that accurately approximate posterior distributions whilst only simulating subsamples of the full data.
Bayesian indirect inference using a parametric auxiliary model
TLDR
A novel framework called Bayesian indirect likelihood (BIL) is developed that encompasses pBII as well as general ABC methods, so that the connections between the methods can be established.
Approximate Bayesian computation using indirect inference
TLDR
A novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms via indirect inference is presented, embedded within a sequential Monte Carlo algorithm that is completely adaptive and requires very little tuning.