Truncated Marginal Neural Ratio Estimation

@article{Miller2021TruncatedMN,
  title={Truncated Marginal Neural Ratio Estimation},
  author={Benjamin Kurt Miller and Alex Cole and Patrick Forr{\'e} and Gilles Louppe and Christoph Weniger},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.01214}
}
Parametric stochastic simulators are ubiquitous in science, often featuring high-dimensional input parameters and/or an intractable likelihood. Performing Bayesian parameter inference in this context can be challenging. We present a neural simulation-based inference algorithm which simultaneously offers simulation efficiency and fast empirical posterior testability, which is unique among modern algorithms. Our approach is simulation efficient by simultaneously estimating low-dimensional marginal posteriors rather than the full joint posterior, and by proposing simulations targeted to an observation of interest via a prior truncated by an indicator function.
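The core ingredient, likelihood-to-evidence ratio estimation, is compact enough to sketch in code: a binary classifier is trained to distinguish jointly drawn pairs (theta, x) ~ p(theta, x) from marginally drawn pairs (theta, x) ~ p(theta)p(x), and its logit converges to log r(theta, x) = log p(x|theta) - log p(x); restricting the classifier's parameter input to a subset of theta yields the corresponding marginal ratio directly. The following toy sketch assumes a two-parameter Gaussian simulator and a small PyTorch network; it illustrates the idea only and is not the authors' swyft implementation.

import torch
import torch.nn as nn

def simulator(theta):
    # toy simulator: x = theta + Gaussian noise (an assumption for illustration)
    return theta + 0.1 * torch.randn_like(theta)

def sample_prior(n):
    # uniform prior on [-1, 1]^2; TMNRE's truncation would shrink this
    # region around the observation over successive rounds
    return 2.0 * torch.rand(n, 2) - 1.0

# classifier over (theta_0, x) pairs; its logit estimates the marginal
# log likelihood-to-evidence ratio log p(x | theta_0) - log p(x)
ratio_net = nn.Sequential(
    nn.Linear(1 + 2, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(ratio_net.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    theta = sample_prior(256)
    x = simulator(theta)
    theta0 = theta[:, :1]  # only the parameter whose marginal we want
    joint = torch.cat([theta0, x], dim=1)                      # label 1
    marginal = torch.cat([theta0[torch.randperm(256)], x], 1)  # label 0
    logits = ratio_net(torch.cat([joint, marginal])).squeeze(1)
    labels = torch.cat([torch.ones(256), torch.zeros(256)])
    loss = bce(logits, labels)
    opt.zero_grad(); loss.backward(); opt.step()

# the unnormalized 1D marginal posterior at an observation x_o is then
# prior(theta_0) * exp(ratio_net(cat([theta_0, x_o]))).

Truncation then enters by iteratively shrinking the prior's support, via an indicator function, to the region where the estimated ratio is non-negligible for the observation of interest, so that later simulation rounds are not wasted on already-excluded parameters.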
2 Citations
Arbitrary Marginal Neural Ratio Estimation for Simulation-based Inference
This work presents a novel method that enables amortized inference over arbitrary subsets of the parameters, without resorting to numerical integration, which makes interpretation of the posterior more convenient.
Neural Conditional Reweighting
There is a growing use of neural network classifiers as unbinned, high-dimensional (and variable-dimensional) reweighting functions. To date, the focus has been on marginal reweighting, where a subset…

References

Showing 1-10 of 69 references.
Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation
This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.
Automatic Posterior Transformation for Likelihood-Free Inference
Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.
Fast likelihood-free cosmology with neural density estimators and active learning
Likelihood-free inference provides a framework for performing rigorous Bayesian inference using only forward simulations, properly accounting for all physical and observational effects that can be…
Simulation-efficient marginal posterior estimation with swyft: stop wasting your precious time
This work presents algorithms for nested neural likelihood-to-evidence ratio estimation and simulation reuse via an inhomogeneous Poisson point process cache of parameters and corresponding simulations that enable automatic and extremely simulator efficient estimation of marginal and joint posteriors.
Likelihood-Free Inference by Ratio Estimation
This work presents an alternative inference approach that is as easy to use as synthetic likelihood but not as restricted in its assumptions, and that naturally enables automatic selection of relevant summary statistics from a large set of candidates (a sketch of the classifier-based idea follows below).
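As a sketch of that idea under assumed toy summaries: an L1-penalized logistic classifier is trained to separate simulations run at a candidate theta from prior-predictive simulations; its decision function then estimates the log likelihood ratio at that theta, and zeroed-out coefficients mark summary statistics as irrelevant. The simulator and summary set below are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(theta, n):
    # toy simulator: 50 iid Gaussian draws centred on theta (an assumption)
    return rng.normal(theta, 1.0, size=(n, 50))

def summaries(raw):
    # candidate summary statistics; only some carry information about theta
    return np.stack([raw.mean(1), raw.var(1), raw.min(1), raw.max(1)], axis=1)

theta_cand = 0.7
x_at_theta = summaries(simulate(theta_cand, 500))    # class 1: fixed theta
thetas = rng.uniform(-2.0, 2.0, size=(500, 1))
x_marginal = summaries(simulate(thetas, 500))        # class 0: prior predictive

clf = LogisticRegression(penalty="l1", C=0.5, solver="liblinear")
clf.fit(np.vstack([x_at_theta, x_marginal]),
        np.concatenate([np.ones(500), np.zeros(500)]))

# with balanced classes, decision_function approximates
# log p(x | theta_cand) - log p(x); zero coefficients flag summaries
# that were not needed for this ratio
log_ratio = clf.decision_function(summaries(simulate(theta_cand, 5)))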
Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models
This paper proposes a strategy which combines probabilistic modeling of the discrepancy with optimization to facilitate likelihood-free inference, and is shown to accelerate the inference through a reduction in the number of required simulations by several orders of magnitude.
Solving high-dimensional parameter inference: marginal posterior densities & Moment Networks
This work proposes direct estimation of lower-dimensional marginal distributions, bypassing high-dimensional density estimation and high-dimensional Markov chain Monte Carlo sampling, and constructs a simple hierarchy of fast neural regression models, called Moment Networks, that compute increasing moments of any desired lower-dimensional marginal posterior density (see the sketch below).
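The construction is simple enough to sketch: a regression network trained with mean-squared error converges to the posterior mean E[theta | x], and a second network regressed on the squared residuals yields the posterior variance; higher moments follow the same pattern. The toy simulator and network sizes below are illustrative assumptions, not the paper's setup.

import torch
import torch.nn as nn

def simulate(n):
    theta = torch.rand(n, 1)               # prior U(0, 1)
    x = theta + 0.05 * torch.randn(n, 1)   # toy simulator (an assumption)
    return theta, x

mean_net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
var_net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam([*mean_net.parameters(), *var_net.parameters()], lr=1e-3)

for step in range(2000):
    theta, x = simulate(256)
    mu = mean_net(x)                        # MSE minimizer -> E[theta | x]
    resid2 = (theta - mu.detach()) ** 2     # squared residuals as targets
    v = var_net(x)                          # -> E[(theta - mu)^2 | x]
    loss = ((theta - mu) ** 2).mean() + ((resid2 - v) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()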
Flexible statistical inference for mechanistic models of neural dynamics
This work builds on recent advances in ABC by learning a neural network which maps features of the observed data to the posterior distribution over parameters: a Bayesian mixture-density network approximating the posterior is learned over multiple rounds of adaptively chosen simulations (a minimal mixture-density sketch follows below).
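The mixture-density construction can be sketched as follows; the toy simulator, single training round, and architecture are assumptions for illustration (the paper additionally adapts the simulation proposal over multiple rounds). The network maps data x to the parameters of a Gaussian mixture over theta and is trained by maximum likelihood on simulated (theta, x) pairs.

import torch
import torch.nn as nn

K = 3  # number of mixture components (an assumption)

class MDN(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(1, 64), nn.ReLU())
        self.logits = nn.Linear(64, K)     # mixture weights
        self.means = nn.Linear(64, K)      # component means
        self.log_stds = nn.Linear(64, K)   # component scales

    def log_prob(self, theta, x):
        h = self.body(x)
        log_w = torch.log_softmax(self.logits(h), dim=-1)
        comps = torch.distributions.Normal(self.means(h), self.log_stds(h).exp())
        comp_lp = comps.log_prob(theta)    # theta broadcasts over K components
        return torch.logsumexp(log_w + comp_lp, dim=-1)

def simulate(n):
    theta = torch.rand(n, 1)               # prior U(0, 1)
    x = theta + 0.05 * torch.randn(n, 1)   # toy simulator
    return theta, x

mdn = MDN()
opt = torch.optim.Adam(mdn.parameters(), lr=1e-3)
for step in range(2000):
    theta, x = simulate(256)
    loss = -mdn.log_prob(theta, x).mean()  # maximize posterior density
    opt.zero_grad(); loss.backward(); opt.step()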
Likelihood-free inference with emulator networks
This work presents a new ABC method which uses probabilistic neural emulator networks to learn synthetic likelihoods on simulated data: both local emulators, which approximate the likelihood for specific observed data, and global ones, which are applicable to a range of data.
Hierarchical Implicit Models and Likelihood-Free Variational Inference
Hierarchical implicit models (HIMs) are introduced, combining implicit densities with hierarchical Bayesian modeling to define models via simulators of data with rich hidden structure, together with likelihood-free variational inference (LFVI), a scalable variational inference algorithm for HIMs.