Approximate Bayesian Computation Via the Energy Statistic

@article{Nguyen2020ApproximateBC,
  title={Approximate Bayesian Computation Via the Energy Statistic},
  author={Hien Duy Nguyen and Julyan Arbel and Hongliang L{\"u} and Florence Forbes},
  journal={IEEE Access},
  year={2020},
  volume={8},
  pages={131683--131698}
}
Approximate Bayesian computation (ABC) has become an essential part of the Bayesian toolbox for addressing problems in which the likelihood is intractable, either because it is prohibitively expensive to evaluate or entirely unknown. ABC defines a pseudo-posterior by comparing observed data with simulated data, traditionally on the basis of summary statistics, the elicitation of which is regarded as a key difficulty. Recently, using data discrepancy measures has been proposed in order to bypass the construction of…
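The paper's core recipe is simple enough to sketch. Below is a minimal rejection-ABC loop using the sample energy distance as the data discrepancy; the estimator is the standard V-statistic form, while the function names, the quantile-based tolerance, and the acceptance rule are illustrative assumptions rather than the authors' exact algorithm.

    import numpy as np

    def energy_distance(x, y):
        # V-statistic estimate of the energy distance between two samples
        # (rows are observations): 2 E||X - Y|| - E||X - X'|| - E||Y - Y'||.
        x = np.asarray(x, float).reshape(len(x), -1)
        y = np.asarray(y, float).reshape(len(y), -1)
        dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1).mean()
        dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1).mean()
        dyy = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1).mean()
        return 2.0 * dxy - dxx - dyy

    def abc_rejection(observed, prior_sample, simulate, n_draws=5000, q=0.01):
        # Hypothetical rejection sampler: draw parameters from the prior,
        # simulate data, and keep the draws whose simulations fall within
        # the q-quantile of the energy distance to the observed sample.
        draws = [prior_sample() for _ in range(n_draws)]
        dists = np.array([energy_distance(observed, simulate(t)) for t in draws])
        eps = np.quantile(dists, q)
        return [t for t, d in zip(draws, dists) if d <= eps]

For example, with prior_sample = lambda: np.random.normal(0, 10) and simulate = lambda t: np.random.normal(t, 1, size=100), the accepted draws approximate the posterior over a Gaussian mean without ever evaluating a likelihood.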

Citations

Approximate Bayesian computation with surrogate posteriors
This work introduces a preliminary learning step in which surrogate posteriors are built from finite Gaussian mixtures using an inverse regression approach; the resulting ABC quasi-posterior distribution is shown to converge to the true posterior under standard conditions.
Score Matched Neural Exponential Families for Likelihood-Free Inference
This work introduces a new way to learn ABC statistics: first, parameter-simulation pairs are generated from the model, independently of the observation; then, Score Matching is used to train a neural conditional exponential family, the largest class of distributions with fixed-size sufficient statistics, to approximate the likelihood.
Discrepancy-based Inference for Intractable Generative Models using Quasi-Monte Carlo
The key results are sample complexity bounds which demonstrate that, under smoothness conditions on the generator, QMC can significantly reduce the number of samples required to obtain a given level of accuracy when using three of the most common discrepancies: the maximum mean discrepancy, the Wasserstein distance, and the Sinkhorn divergence.
On the mathematical axiomatization of approximate Bayesian computation. A robust set for estimating mechanistic network models through optimal transport
The idea is to simulate sets of synthetic data from the model under assigned parameters and, rather than comparing these data (or summaries of them) with the corresponding observed values as ABC typically requires, to employ a distance between a chosen distribution associated with the synthetic data and one associated with the observed values.
A Comparison of Likelihood-Free Methods With and Without Summary Statistics
This article provides a review of full-data, distance-based likelihood-free methods and conducts the first comprehensive comparison of such methods, both qualitatively and empirically.
Approximating Bayes in the 21st Century
The aim is to help new researchers in particular, and more generally those interested in adopting a Bayesian approach to empirical work, distinguish between different approximate techniques; understand the sense in which they are approximate; appreciate when and why particular methods are useful; and see the ways in which they can be combined.
Probabilistic Forecasting with Conditional Generative Networks via Scoring Rule Minimization
This manuscript performs probabilistic forecasting with conditional generative networks trained to minimize scoring-rule values on two chaotic models and a global dataset of weather observations; the results are satisfactory and better calibrated than those achieved by GANs.
Approximate Bayesian Inference
This is the Editorial article summarizing the scope of the Special Issue: Approximate Bayesian Inference.
Approximate Bayesian Computation via Classification
The theoretical results show that the rate at which ABC posterior distributions concentrate around the true parameter depends on the estimation error of the classifier, and the usefulness of the approach is demonstrated on simulated examples as well as real data in the context of stock volatility estimation.
Generalized Bayesian Likelihood-Free Inference Using Scoring Rules Estimators
A framework for Bayesian likelihood-free inference based on generalized Bayesian inference with scoring rules (SRs) is proposed, and finite-sample posterior consistency and outlier robustness of the resulting posterior are proved for the Kernel and Energy Scores.
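To make the scoring-rule posterior concrete, here is a sketch under the assumption of the standard unbiased energy-score estimator; the weight w and the function names are illustrative, not the authors' code.

    import numpy as np

    def energy_score(simulations, y):
        # Unbiased estimate of the energy score S(P_theta, y) from model
        # draws x_1..x_n: mean_i ||x_i - y|| - (1/2) mean_{i!=j} ||x_i - x_j||.
        x = np.asarray(simulations, float).reshape(len(simulations), -1)
        y = np.asarray(y, float).ravel()
        n = len(x)
        term1 = np.linalg.norm(x - y, axis=1).mean()
        pair_sum = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1).sum()
        return term1 - 0.5 * pair_sum / (n * (n - 1))

    # The generalized posterior reweights the prior by exp(-w * energy_score),
    # e.g. inside an MCMC acceptance ratio, with w > 0 a learning-rate choice.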

References

Showing 1–10 of 51 references
On the Asymptotic Efficiency of Approximate Bayesian Computation Estimators
Conditions are given on the importance-sampling proposal distribution under which the variance of the estimator is of the same order as that of the maximum likelihood estimator based on the summary statistics used, and an iterative importance-sampling algorithm is suggested and evaluated empirically on a stochastic volatility model.
Approximate Bayesian Computation: A Nonparametric Perspective
Approximate Bayesian Computation is a family of likelihood-free inference techniques that are well suited to models defined in terms of a stochastic generating mechanism. In a nutshell, Approximate…
K2-ABC: Approximate Bayesian Computation with Kernel Embeddings
This paper proposes a fully nonparametric ABC paradigm which circumvents the need for manually selecting summary statistics, and uses maximum mean discrepancy (MMD) as a dissimilarity measure between the distributions over observed and simulated data.
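A minimal squared-MMD estimate with a Gaussian kernel might look as follows; the bandwidth and the soft-weighting comment are assumptions in the spirit of K2-ABC, not its exact implementation.

    import numpy as np

    def mmd2(x, y, bandwidth=1.0):
        # Biased (V-statistic) estimate of squared MMD with a Gaussian kernel:
        # mean k(x, x') + mean k(y, y') - 2 mean k(x, y).
        def gram(a, b):
            sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
            return np.exp(-sq / (2.0 * bandwidth ** 2))
        x = np.asarray(x, float).reshape(len(x), -1)
        y = np.asarray(y, float).reshape(len(y), -1)
        return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

    # K2-ABC-style soft weight for one parameter draw (epsilon > 0 assumed):
    # weight = np.exp(-mmd2(observed, simulated) / epsilon)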
Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy
This work adopts a Kullback-Leibler divergence estimator to assess the data discrepancy and achieves comparable or higher quasi-posterior quality than existing methods based on other discrepancy measures.
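Whether the paper uses exactly this estimator is an assumption, but the standard k-nearest-neighbour KL estimator (in the Wang-Kulkarni-Verdú form) conveys the idea:

    import numpy as np
    from scipy.spatial import cKDTree

    def kl_knn(x, y, k=1):
        # k-NN estimate of KL(P || Q) from samples x ~ P and y ~ Q:
        # (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
        # where rho_k is the k-th NN distance within x (self excluded)
        # and nu_k the k-th NN distance from x_i to the sample y.
        x = np.asarray(x, float).reshape(len(x), -1)
        y = np.asarray(y, float).reshape(len(y), -1)
        n, d = x.shape
        m = len(y)
        rho = cKDTree(x).query(x, k + 1)[0][:, -1]
        nu = cKDTree(y).query(x, k)[0]
        if k > 1:
            nu = nu[:, -1]
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))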
Approximate Bayesian computation with the Wasserstein distance
This work proposes to avoid the use of summaries and the ensuing loss of information by instead using the Wasserstein distance between the empirical distributions of the observed and synthetic data, and generalizes the well-known approach of using order statistics within approximate Bayesian computation to arbitrary dimensions.
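In one dimension the computation reduces to sorting, which is where the order statistics enter; below is a sketch for equal-size samples (the equal-size restriction is assumed here for brevity, and the paper's multivariate generalization is not covered):

    import numpy as np

    def wasserstein_1d(x, y, p=1):
        # For equal-size 1-D samples, the p-Wasserstein distance between the
        # empirical distributions is the L_p distance between order statistics.
        x, y = np.sort(x), np.sort(y)
        assert len(x) == len(y), "equal sample sizes assumed in this sketch"
        return np.mean(np.abs(x - y) ** p) ** (1.0 / p)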
Robust Bayesian Inference via Coarsening
This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
Relevant statistics for Bayesian model choice
This work derives necessary and sufficient conditions on summary statistics for the corresponding Bayes factor to be convergent, namely to select the true model asymptotically under the two models.
Likelihood-free Bayesian estimation of multivariate quantile distributions
An Approximate Likelihood Perspective on ABC Methods
This article provides a unifying review, general representation, and classification of all ABC methods from the view of approximate likelihood theory, which clarifies how ABC methods can be characterized, related, combined, improved, and applied for future research.
Likelihood-free inference via classification
This work finds that classification accuracy can be used to assess the discrepancy between simulated and observed data, thereby making the complete arsenal of classification methods available for inference of intractable generative models.
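The idea admits a short sketch: train any off-the-shelf classifier to separate observed from simulated data and read its cross-validated accuracy as the discrepancy; logistic regression below is an illustrative stand-in, not the paper's prescribed classifier.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def classification_discrepancy(observed, simulated):
        # Accuracy near 0.5 means the classifier cannot tell the two samples
        # apart (small discrepancy); accuracy near 1.0 means they differ.
        X = np.vstack([observed, simulated])
        z = np.r_[np.zeros(len(observed)), np.ones(len(simulated))]
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X, z, cv=5).mean()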