Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation

@article{Delaunoy2022TowardsRS,
  title={Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation},
  author={Arnaud Delaunoy and Joeri Hermans and François Rozet and Antoine Wehenkel and Gilles Louppe},
  journal={ArXiv},
  year={2022},
  volume={abs/2208.13624}
}
Modern approaches for simulation-based inference rely upon deep learning surrogates to enable approximate inference with computer simulators. In practice, however, the computational faithfulness of the estimated posteriors is rarely guaranteed. For example, Hermans et al. [1] show that current simulation-based inference algorithms can produce posteriors that are overconfident, hence risking false inferences. In this work, we introduce Balanced Neural Ratio Estimation (BNRE), a variation of the NRE…
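The balancing idea can be sketched in a few lines. Below is a minimal PyTorch sketch, assuming the regularizer described in the paper: the standard NRE binary cross-entropy plus a penalty pushing the classifier towards the balancing condition E_joint[d] + E_marginal[d] = 1. All names (classifier, lmbda) and the default weight are illustrative, not the authors' reference implementation.

import torch
import torch.nn.functional as F

def bnre_loss(classifier, theta, x, lmbda=100.0):
    # Dependent pairs (theta, x) ~ p(theta, x); shuffling theta yields
    # independent pairs (theta', x) ~ p(theta) p(x).
    theta_prime = theta[torch.randperm(theta.shape[0])]
    logits_joint = classifier(theta, x)           # target class 1
    logits_marginal = classifier(theta_prime, x)  # target class 0

    # Standard NRE discriminator loss.
    bce = F.binary_cross_entropy_with_logits(
        logits_joint, torch.ones_like(logits_joint)
    ) + F.binary_cross_entropy_with_logits(
        logits_marginal, torch.zeros_like(logits_marginal)
    )

    # Balancing penalty: a balanced classifier satisfies
    # E_joint[d(theta, x)] + E_marginal[d(theta', x)] = 1.
    d_joint = torch.sigmoid(logits_joint).mean()
    d_marginal = torch.sigmoid(logits_marginal).mean()
    return bce + lmbda * (d_joint + d_marginal - 1.0) ** 2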

References

Showing 1-10 of 31 references

Truncated Marginal Neural Ratio Estimation

This work presents a neural simulation-based inference algorithm that simultaneously offers simulation efficiency and fast empirical posterior testability, a combination unique among modern algorithms.

Benchmarking Simulation-Based Inference

This work provides a benchmark with inference tasks and suitable performance metrics for ‘likelihood-free’ inference algorithms, with an initial selection spanning recent neural-network approaches and classical Approximate Bayesian Computation methods.
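The benchmark is distributed as the sbibm Python package. A minimal usage sketch, assuming the package's public API (task names and call signatures as documented at the time of writing; treat specifics as assumptions):

import sbibm

task = sbibm.get_task("two_moons")   # one of the benchmark tasks
prior = task.get_prior()
simulator = task.get_simulator()

theta = prior(num_samples=1000)      # parameters drawn from the prior
x = simulator(theta)                 # corresponding simulator outputs

observation = task.get_observation(num_observation=1)
reference = task.get_reference_posterior_samples(num_observation=1)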

Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.
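The recipe is straightforward to sketch: simulate (theta, x) pairs, fit a conditional density estimator q(theta | x) by maximum likelihood, and evaluate it at the observed x. A minimal PyTorch sketch with a diagonal Gaussian head (the paper uses mixture density networks; all names here are illustrative):

import torch
import torch.nn as nn

class GaussianPosterior(nn.Module):
    # Maps x to the mean and log-std of a diagonal Gaussian over theta.
    def __init__(self, x_dim, theta_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * theta_dim),
        )

    def log_prob(self, theta, x):
        mean, log_std = self.net(x).chunk(2, dim=-1)
        return torch.distributions.Normal(mean, log_std.exp()).log_prob(theta).sum(-1)

# Training maximises log q(theta | x) over simulated pairs:
#   loss = -model.log_prob(theta, x).mean()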

Likelihood-free MCMC with Amortized Approximate Ratio Estimators

It is demonstrated that the learned ratio estimator can be embedded in MCMC samplers to approximate likelihood ratios between consecutive states in the Markov chain, allowing one to draw samples from the intractable posterior.
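Because the evidence p(x) cancels in the Metropolis-Hastings acceptance ratio, a learned log-ratio log r(theta, x) ≈ log p(x | theta) - log p(x) can stand in for the intractable likelihood. A minimal sketch, assuming log_ratio and log_prior are given callables:

import torch

def mh_step(theta, x_obs, log_ratio, log_prior, step=0.1):
    # Symmetric Gaussian proposal.
    theta_new = theta + step * torch.randn_like(theta)
    # The amortized ratio replaces the likelihood; p(x_obs) cancels
    # between the numerator and denominator of the acceptance ratio.
    log_alpha = (log_ratio(theta_new, x_obs) + log_prior(theta_new)
                 - log_ratio(theta, x_obs) - log_prior(theta))
    if torch.log(torch.rand(())) < log_alpha:
        return theta_new
    return theta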

Robust Bayesian Inference for Simulator-based Models via the MMD Posterior Bootstrap

This paper proposes a novel algorithm based on the posterior bootstrap and maximum mean discrepancy estimators that leads to a highly parallelisable Bayesian inference algorithm with strong robustness properties for simulators.
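For reference, the squared maximum mean discrepancy between distributions P and Q under a kernel k, the quantity the bootstrap targets, takes the standard form (notation mine):

\mathrm{MMD}^2(P, Q) = \mathbb{E}_{x, x' \sim P}[k(x, x')] + \mathbb{E}_{y, y' \sim Q}[k(y, y')] - 2\,\mathbb{E}_{x \sim P,\; y \sim Q}[k(x, y)]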

Robust Bayesian Inference via Coarsening

This work introduces a novel approach to Bayesian inference that improves robustness to small departures from the model: rather than conditioning on the event that the observed data are generated by the model, one conditions on the event that the model generates data close to the observed data, in a distributional sense.
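Concretely, with a relative-entropy notion of closeness and an exponential prior on the tolerance, the coarsened posterior is approximately a power posterior; a sketch of the paper's main approximation, where alpha controls the degree of coarsening and d_n measures the distributional distance between observed and model-generated data:

p(\theta \mid d_n < \epsilon) \;\mathrel{\dot\propto}\; p(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\zeta_n}, \qquad \zeta_n = \frac{\alpha}{\alpha + n}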

Automatic Posterior Transformation for Likelihood-Free Inference

Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.
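The transformation underlying APT is the Bayes-rule identity relating the posterior under a proposal \tilde{p}(\theta) to the posterior under the prior p(\theta) (standard algebra; notation mine):

\tilde{p}(\theta \mid x) \;=\; p(\theta \mid x)\,\frac{\tilde{p}(\theta)}{p(\theta)}\,\frac{p(x)}{\tilde{p}(x)}

so a density estimate reparameterised through this identity can be trained on proposal-generated data while still recovering the posterior under the prior.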

Score Matched Neural Exponential Families for Likelihood-Free Inference

This work introduces a new way to learn ABC statistics: first, parameter-simulation pairs are generated from the model independently of the observation; then, score matching is used to train a neural conditional exponential family, the largest class of distributions with fixed-size sufficient statistics, to approximate the likelihood.
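Score matching fits an unnormalised density by matching gradients of log-densities with respect to the data, so the intractable normalising constant never appears. Hyvärinen's objective, adapted here to the conditional case (notation mine):

J(\psi) = \mathbb{E}_{p(\theta, x)}\!\left[ \operatorname{tr}\!\big(\nabla_x^2 \log q_\psi(x \mid \theta)\big) + \tfrac{1}{2}\,\big\|\nabla_x \log q_\psi(x \mid \theta)\big\|^2 \right]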

Likelihood-Free Frequentist Inference: Bridging Classical Statistics and Machine Learning in Simulation and Uncertainty Quantification

This paper presents a statistical framework for LFI that unifies classical statistics with modern machine learning to construct frequentist confidence sets and hypothesis tests with finite-sample guarantees of nominal coverage, together with rigorous diagnostics for assessing empirical coverage over the entire parameter space.
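The underlying construction is classical Neyman inversion: estimate a test statistic \lambda and its critical values with machine learning, then collect every parameter value the test fails to reject (sketch; notation mine):

C(x) \;=\; \{\,\theta : \lambda(x; \theta) \ge c_{\theta,\alpha}\,\}, \qquad \mathbb{P}_{X \mid \theta}\big(\lambda(X; \theta) \ge c_{\theta,\alpha}\big) \ge 1 - \alpha,

which guarantees \mathbb{P}(\theta \in C(X)) \ge 1 - \alpha for every \theta.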

Inference from Iterative Simulation Using Multiple Sequences

The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
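This is the origin of the potential scale reduction factor R-hat. A minimal NumPy sketch for m chains of length n (function name illustrative; modern practice also splits each chain in half first):

import numpy as np

def gelman_rubin(chains):
    # chains: array of shape (m, n), one row per independent sequence.
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)          # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()    # within-chain variance
    var_plus = (n - 1) / n * W + B / n       # pooled variance estimate
    return np.sqrt(var_plus / W)             # R-hat; near 1 indicates mixing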