Corpus ID: 203656839

Likelihood-free MCMC with Amortized Approximate Likelihood Ratios

Authors: Joeri Hermans, Volodimir Begy, Gilles Louppe
Journal: arXiv: Machine Learning
Posterior inference with an intractable likelihood is an increasingly common task in scientific domains that rely on sophisticated computer simulations. Typically, these mechanistic models do not admit tractable densities, forcing practitioners to rely on approximations during inference. This work proposes a novel approach that addresses the intractability of both the likelihood and the marginal model, achieved by learning a flexible amortized estimator which approximates the likelihood-to-evidence ratio.
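The ratio trick behind this line of work can be sketched with a toy one-dimensional Gaussian simulator (the model, sample sizes, and classifier below are illustrative assumptions, not the paper's architecture): a classifier trained to distinguish dependent pairs (theta, x) drawn from the joint from independent pairs drawn from the product of marginals recovers the log likelihood-to-evidence ratio in its decision function.

```python
# Minimal sketch of likelihood-to-evidence ratio estimation on a toy
# simulator x ~ N(theta, 1) with prior theta ~ N(0, 2^2). All choices
# here (simulator, features, classifier) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
n = 20000

theta = rng.normal(0.0, 2.0, size=n)
x = rng.normal(theta, 1.0)

# Class 1: dependent pairs from the joint p(theta, x).
# Class 0: independent pairs from p(theta) p(x), made by shuffling x.
x_marg = rng.permutation(x)
features = np.vstack([np.column_stack([theta, x]),
                      np.column_stack([theta, x_marg])])
labels = np.concatenate([np.ones(n), np.zeros(n)])

# Quadratic features let a linear classifier represent the exact
# log-ratio of this Gaussian toy model (which is quadratic in theta, x).
clf = make_pipeline(PolynomialFeatures(2),
                    LogisticRegression(max_iter=1000)).fit(features, labels)

def log_ratio(theta_val, x_val):
    # The decision function (logit of the class probability) approximates
    # log r(x | theta) = log p(x | theta) - log p(x).
    return clf.decision_function([[theta_val, x_val]])[0]

# The estimated ratio should favour theta values close to the observation.
print(log_ratio(1.0, 1.0), log_ratio(-3.0, 1.0))
```

Plugging such an amortized estimator into an MCMC acceptance ratio is what makes likelihood-free posterior sampling possible without ever evaluating the likelihood.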

Generalised Bayes Updates with $f$-divergences through Probabilistic Classifiers

This work considers the behavior of generalized belief updates for various specific choices within the $f$-divergence family, and shows that for certain divergence functions this approach can even improve on methods that evaluate the correct model likelihood analytically.

Confidence Sets and Hypothesis Testing in a Likelihood-Free Inference Setting

A frequentist approach to LFI is presented that first formulates the classical likelihood ratio test (LRT) as a parametrized classification problem, and then uses the equivalence of tests and confidence sets to build confidence regions for parameters of interest.

Unifying Likelihood-free Inference with Black-box Optimization and Beyond

This work proposes to unify two seemingly distinct worlds: likelihood-free inference and black-box optimization, under one probabilistic framework, and provides a recipe for constructing various sequence design methods based on this framework.

Unifying Likelihood-free Inference with Black-box Sequence Design and Beyond

This work proposes to unify two seemingly distinct worlds: likelihood-free inference and black-box sequence design, under one probabilistic framework and provides a recipe for constructing various sequence design methods based on this framework.

PyAutoLens: Open-Source Strong Gravitational Lensing

PyAutoLens is an open-source software package for the modelling and analysis of strong gravitational lenses.

Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.

Variational Bayes with synthetic likelihood

This article develops alternatives to Markov chain Monte Carlo implementations of Bayesian synthetic likelihoods with reduced computational overheads, using stochastic gradient variational inference methods for posterior approximation in the synthetic likelihood context, employing unbiased estimates of the log likelihood.
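The synthetic-likelihood idea underlying this article can be sketched in a few lines (the simulator, summary statistics, and sample sizes below are illustrative assumptions): simulate summaries at a candidate parameter, fit a Gaussian to them, and score the observed summary under that Gaussian.

```python
# Minimal sketch of a Gaussian synthetic likelihood on a hypothetical
# toy simulator whose summaries are the sample mean and std of
# N(theta, 1) draws. All specifics are illustrative assumptions.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

def simulate_summaries(theta, n_sim=500):
    data = rng.normal(theta, 1.0, size=(n_sim, 50))
    return np.column_stack([data.mean(axis=1), data.std(axis=1)])

def synthetic_loglik(theta, s_obs):
    # Fit a Gaussian to the simulated summaries and evaluate s_obs under it.
    s = simulate_summaries(theta)
    mu, cov = s.mean(axis=0), np.cov(s, rowvar=False)
    return multivariate_normal(mu, cov).logpdf(s_obs)

s_obs = np.array([0.0, 1.0])  # observed summaries
# The synthetic log likelihood should peak near the data-generating parameter.
print(synthetic_loglik(0.0, s_obs), synthetic_loglik(2.0, s_obs))
```

The variational approach of the article replaces MCMC over this quantity with stochastic-gradient optimization of a posterior approximation, using unbiased estimates of the log likelihood.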

Efficient Approximate Bayesian Computation Coupled With Markov Chain Monte Carlo Without Likelihood

The principal idea is to relax the tolerance within MCMC to permit good mixing, while retaining a good approximation to the posterior through a combination of subsampling the output and regression adjustment, realizing substantial computational gains over standard ABC.
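The regression-adjustment step can be sketched with a simple rejection-ABC analogue (the simulator, tolerance, and prior below are illustrative assumptions, not the paper's MCMC scheme): accept parameters under a loose tolerance, then shift the accepted draws toward the observed summary using a least-squares slope.

```python
# Minimal sketch of ABC rejection with linear regression adjustment on a
# toy model: summary s = theta + noise, observed s_obs. All specifics
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 50000

theta = rng.uniform(-5, 5, size=n)   # prior draws
s = rng.normal(theta, 1.0)           # summary of each simulated data set
s_obs = 1.5                          # observed summary

# A loose tolerance keeps many samples (the analogue of good mixing) ...
eps = 1.0
keep = np.abs(s - s_obs) < eps
th_acc, s_acc = theta[keep], s[keep]

# ... and the regression adjustment corrects for the slack:
# theta_adj = theta - beta * (s - s_obs), beta from least squares.
beta = np.cov(s_acc, th_acc)[0, 1] / np.var(s_acc)
th_adj = th_acc - beta * (s_acc - s_obs)

# The adjusted sample should be tighter around the true posterior mean.
print(np.var(th_acc), np.var(th_adj))
```

In the toy model the true posterior is approximately N(s_obs, 1), so the adjustment shrinks the inflated variance of the loosely accepted sample back toward it.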

Likelihood-Free Inference by Ratio Estimation

An alternative inference approach is presented that is as easy to use as synthetic likelihood but less restricted in its assumptions, and that naturally enables automatic selection of relevant summary statistics from a large set of candidates.

Likelihood-free inference via classification

This work finds that classification accuracy can be used to assess the discrepancy between simulated and observed data; the complete arsenal of classification methods thereby becomes available for inference in intractable generative models.
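The idea can be sketched as a classifier two-sample test (the simulator, classifier, and sample sizes below are illustrative assumptions): if a held-out classifier cannot tell simulated from observed data (accuracy near 0.5), the candidate parameter generates data resembling the observation.

```python
# Minimal sketch of classification accuracy as a discrepancy measure on a
# toy simulator x ~ N(theta, 1). All specifics are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

def discrepancy(theta, x_obs):
    x_sim = rng.normal(theta, 1.0, size=x_obs.shape)
    X = np.concatenate([x_sim, x_obs]).reshape(-1, 1)
    y = np.concatenate([np.zeros(len(x_sim)), np.ones(len(x_obs))])
    # Cross-validated accuracy: 0.5 means the classifier cannot
    # distinguish simulated from observed data.
    return cross_val_score(LogisticRegression(), X, y, cv=5).mean()

x_obs = rng.normal(0.0, 1.0, size=500)
print(discrepancy(0.0, x_obs), discrepancy(3.0, x_obs))
```

Parameters yielding accuracy near chance are favoured, which turns any off-the-shelf classifier into a discrepancy for ABC-style inference.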

Automatic Posterior Transformation for Likelihood-Free Inference

Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.

Mining gold from implicit models to improve likelihood-free inference

Inference techniques for this case are presented that combine the insight that additional latent information can be extracted from the simulator with the power of neural networks in regression and density estimation tasks, leading to better sample efficiency and quality of inference.

Recurrent machines for likelihood-free inference

This work designs a recurrent inference machine that learns a sequence of parameter updates leading to good parameter estimates, without ever specifying some explicit notion of divergence between the simulated data and the real data distributions.

GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation

This work develops two new ABC sampling algorithms that significantly reduce the number of simulations required for posterior inference, by storing the information obtained from every simulation in a Gaussian process that acts as a surrogate function for the simulated statistics.
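The surrogate idea can be sketched as follows (the simulator, discrepancy, and simulation budget below are illustrative assumptions, not the GPS-ABC algorithm itself): run a handful of expensive simulations, fit a Gaussian process to the resulting discrepancies, and query the cheap surrogate everywhere else.

```python
# Minimal sketch of a GP surrogate replacing an expensive simulator on a
# toy problem. All specifics are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

def expensive_discrepancy(theta):
    # Hypothetical simulator: squared distance between the simulated
    # sample mean and the "observed" mean of 1.0.
    sims = rng.normal(theta, 1.0, size=200)
    return (sims.mean() - 1.0) ** 2

# A small budget of real simulations ...
thetas = np.linspace(-4, 4, 15).reshape(-1, 1)
d = np.array([expensive_discrepancy(t[0]) for t in thetas])

# ... stored in a GP surrogate that can be queried cheaply anywhere.
gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-2).fit(thetas, d)

grid = np.linspace(-4, 4, 401).reshape(-1, 1)
best = grid[np.argmin(gp.predict(grid)), 0]
print(best)  # surrogate minimum; the true minimum is near theta = 1
```

The full algorithms additionally use the GP's predictive uncertainty to decide when a new real simulation is actually needed, which is where the large savings come from.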

Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models

This paper proposes a strategy which combines probabilistic modeling of the discrepancy with optimization to facilitate likelihood-free inference and is shown to accelerate the inference through a reduction in the number of required simulations by several orders of magnitude.