Corpus ID: 203656839

Likelihood-free MCMC with Amortized Approximate Likelihood Ratios

@article{hermans_lfmcmc,
  title={Likelihood-free MCMC with Amortized Approximate Likelihood Ratios},
  author={J. Hermans and Volodimir Begy and Gilles Louppe},
  journal={arXiv: Machine Learning}
}
Posterior inference with an intractable likelihood is becoming an increasingly common task in scientific domains that rely on sophisticated computer simulations. Typically, these mechanistic models do not admit tractable densities, forcing practitioners to rely on approximations during inference. This work proposes a novel approach to address the intractability of the likelihood and the marginal model, achieved by learning a flexible estimator which approximates the likelihood-to-evidence ratio.
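The ratio-estimation technique summarized above can be illustrated with a toy example: train a binary classifier to distinguish dependent pairs (θ, x) drawn from the joint from independent pairs drawn from the product of marginals; at the optimum, the classifier's logit approximates the log likelihood-to-evidence ratio. A minimal sketch in plain NumPy, assuming a toy Gaussian simulator (the quadratic features, learning rate, and iteration count are illustrative choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumption, for illustration): prior theta ~ N(0, 1),
# simulator x | theta ~ N(theta, 1).
def simulate(theta):
    return theta + rng.normal(size=np.shape(theta))

n = 5000
theta_joint = rng.normal(size=n)
x_joint = simulate(theta_joint)            # dependent pairs   -> class 1
theta_marg = rng.normal(size=n)
x_marg = simulate(rng.normal(size=n))      # independent pairs -> class 0

def features(theta, x):
    # Quadratic features suffice to represent the exact log ratio
    # for this Gaussian toy model.
    return np.stack([theta, x, theta * x, theta**2, x**2], axis=-1)

X = np.concatenate([features(theta_joint, x_joint),
                    features(theta_marg, x_marg)])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Logistic regression by gradient descent.  At the optimum, the logit
# equals log r(x | theta) = log p(x | theta) - log p(x).
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

def log_ratio(theta, x):
    return features(theta, x) @ w + b
```

In the paper's scheme, an amortized estimator of this kind is trained once and then plugged into a standard Metropolis-Hastings sampler targeting log p(θ) + log r(x_obs | θ), so no further likelihood evaluations are needed.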
Generalised Bayes Updates with $f$-divergences through Probabilistic Classifiers
This work considers the behavior of generalised belief updates for various specific choices within the $f$-divergence family, and shows that for certain divergence functions this approach can even improve on methods that evaluate the correct model likelihood analytically.
Confidence Sets and Hypothesis Testing in a Likelihood-Free Inference Setting
A frequentist approach to LFI is presented that first formulates the classical likelihood ratio test (LRT) as a parametrized classification problem, and then uses the equivalence of tests and confidence sets to build confidence regions for parameters of interest.
PyAutoLens: Open-Source Strong Gravitational Lensing
An open-source software package for the analysis and modelling of strong gravitational lens systems.
Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation
This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods to produce a single sample from an approximate posterior.
Variational Bayes with synthetic likelihood
This article develops alternatives to Markov chain Monte Carlo implementations of Bayesian synthetic likelihood with reduced computational overhead, using stochastic gradient variational inference for posterior approximation in the synthetic-likelihood context, with unbiased estimates of the log likelihood.
Efficient Approximate Bayesian Computation Coupled With Markov Chain Monte Carlo Without Likelihood
The principal idea is to relax the tolerance within MCMC to permit good mixing, but to retain a good approximation to the posterior through a combination of subsampling the output and regression adjustment, realizing substantial computational advances over standard ABC.
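The likelihood-free MCMC scheme underlying this line of work can be sketched on a toy Gaussian problem: the intractable likelihood ratio in the Metropolis-Hastings test is replaced by an indicator that freshly simulated data lands within a tolerance ε of the observation. The simulator, prior, tolerance, and proposal scale below are all illustrative assumptions, and the subsampling and regression-adjustment refinements from the summary are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

x_obs = 1.0                          # observed summary statistic

def simulate(theta):                 # toy simulator (assumption): x ~ N(theta, 1)
    return theta + rng.normal()

def log_prior(theta):                # prior (assumption): theta ~ N(0, 1)
    return -0.5 * theta**2

eps = 0.5                            # ABC tolerance
theta, chain = 0.0, []
for _ in range(10000):
    prop = theta + 0.5 * rng.normal()        # symmetric random-walk proposal
    # Likelihood-free MH step: accept only if the simulated data hits the
    # eps-ball around the observation; only the prior ratio enters the
    # accept test (the proposal is symmetric, so its ratio cancels).
    if abs(simulate(prop) - x_obs) < eps:
        if np.log(rng.uniform()) < log_prior(prop) - log_prior(theta):
            theta = prop
    chain.append(theta)

posterior_mean = np.mean(chain[2000:])       # discard burn-in
```

For this conjugate toy model the exact posterior mean is 0.5, which the chain approximates up to the bias introduced by the tolerance ε.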
Likelihood-free inference via classification
This work finds that classification accuracy can be used to assess the discrepancy between simulated and observed data, thereby making the complete arsenal of classification methods available for inference in intractable generative models.
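The idea can be illustrated with a toy discrepancy: train a classifier to separate observed from simulated samples and use its accuracy as the distance; accuracy near 0.5 means the two sets are indistinguishable, i.e. the parameter reproduces the observations well. A sketch assuming a one-dimensional Gaussian simulator and a simple midpoint-threshold classifier (stand-ins for the arbitrary simulator and classifier of the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta, n):              # toy simulator (assumption): N(theta, 1)
    return theta + rng.normal(size=n)

x_obs = simulate(1.0, 500)           # "observed" data, true theta = 1

def classifier_accuracy(theta):
    """Discrepancy: accuracy of a midpoint-threshold classifier that tries
    to separate observed from simulated samples.  Near 0.5 the two sets
    are statistically indistinguishable, i.e. theta fits the data well."""
    x_sim = simulate(theta, 500)
    t = 0.5 * (x_obs.mean() + x_sim.mean())  # decision threshold
    if x_obs.mean() >= x_sim.mean():
        return 0.5 * ((x_obs > t).mean() + (x_sim <= t).mean())
    return 0.5 * ((x_obs <= t).mean() + (x_sim > t).mean())
```

A likelihood-free posterior can then be targeted by treating, e.g., `classifier_accuracy(theta) - 0.5` as the discrepancy inside a standard rejection-ABC loop.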
Automatic Posterior Transformation for Likelihood-Free Inference
Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.
Dynamic Likelihood-free Inference via Ratio Estimation (DIRE)
It is shown that convolutional neural networks trained to predict the input parameters from the data provide suitable summary statistics for LFIRE, and that on a wide range of time-series models a single neural network architecture produces posteriors as accurate as, or more accurate than, those of alternative methods.
Mining gold from implicit models to improve likelihood-free inference
Inference techniques are presented for this case that combine the insight that additional latent information can be extracted from the simulator with the power of neural networks in regression and density estimation tasks, leading to better sample efficiency and quality of inference.
Recurrent machines for likelihood-free inference
This work designs a recurrent inference machine that learns a sequence of parameter updates leading to good parameter estimates, without ever specifying an explicit notion of divergence between the simulated and real data distributions.
GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation
This work develops two new ABC sampling algorithms that significantly reduce the number of simulations necessary for posterior inference, storing the information obtained from every simulation in a Gaussian process that acts as a surrogate function for the simulated statistics.
Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models
This paper proposes a strategy that combines probabilistic modelling of the discrepancy with optimization to facilitate likelihood-free inference, and is shown to accelerate inference by reducing the number of required simulations by several orders of magnitude.