# MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories

@article{Isacchini2022MINIMALISTMI, title={MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories}, author={Giulio Isacchini and Natanael Spisak and Armita Nourmohammad and Thierry Mora and Aleksandra M. Walczak}, journal={Physical Review E}, year={2022}, volume={105}, number={5-2}, pages={055309}}

Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice. One class of methods uses data simulated with different parameters to infer models of the likelihood-to-evidence ratio, or equivalently the posterior function. Here we frame the inference task as an estimation of an energy function parametrized with an artificial neural network. We present an intuitive approach, named MINIMALIST, in which the optimal model of the…
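The likelihood-to-evidence ratio trick underlying this class of methods can be illustrated with a minimal sketch. This is not the paper's MINIMALIST energy model; the Gaussian toy simulator, the feature map, and every name below are illustrative assumptions. The idea shared by these methods: a logistic classifier trained to distinguish dependent pairs (θ, x) from shuffled (product-of-marginals) pairs has a logit that converges to log p(x|θ)/p(x), exactly the ratio needed for posterior inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (an assumption, not from the paper):
# theta ~ N(0, 1), x | theta ~ N(theta, 1).
n = 20000
theta = rng.normal(0.0, 1.0, n)
x = theta + rng.normal(0.0, 1.0, n)

# Positive class: joint pairs; negative class: shuffled pairs,
# which approximate samples from the product of marginals.
x_shuf = rng.permutation(x)

def features(t, u):
    """Quadratic features; enough to represent the true log ratio here."""
    t, u = np.atleast_1d(t), np.atleast_1d(u)
    return np.stack([np.ones_like(t), t, u, t * u, t**2, u**2], axis=1)

X = np.vstack([features(theta, x), features(theta, x_shuf)])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Plain full-batch logistic regression by gradient descent.
w = np.zeros(X.shape[1])
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / len(y)

def log_ratio(t, u):
    """Classifier logit, an estimate of log p(x|theta) / p(x)."""
    return features(t, u) @ w

# A matched pair should score higher than a mismatched one.
print(log_ratio(1.0, 1.0), log_ratio(1.0, -1.0))
```

In practice the linear-in-features classifier is replaced by a neural network, and the learned logit is used directly as an amortized log-ratio, e.g. inside an MCMC sampler over θ.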

## References

*Showing 1–10 of 47 references.*

### Likelihood-Free Inference by Ratio Estimation

- *Bayesian Analysis*, 2021

Presents an alternative inference approach that is as easy to use as synthetic likelihood but less restricted in its assumptions, and that naturally enables automatic selection of relevant summary statistics from a large set of candidates.

### Flexible statistical inference for mechanistic models of neural dynamics

- *NIPS*, 2017

This work builds on recent advances in ABC by learning a neural network which maps features of the observed data to the posterior distribution over parameters, and learns a Bayesian mixture-density network approximating the posterior over multiple rounds of adaptively chosen simulations.

### Automatic Posterior Transformation for Likelihood-Free Inference

- *ICML*, 2019

Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.

### On Contrastive Learning for Likelihood-free Inference

- *ICML*, 2020

This work shows that two popular likelihood-free approaches to parameter inference in stochastic simulator models can be unified under a general contrastive learning scheme, and clarify how they should be run and compared.

### Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

- *NIPS*, 2016

This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods to produce a single sample from an approximate posterior.

### Noise-contrastive estimation: A new estimation principle for unnormalized statistical models

- *AISTATS*, 2010

A new estimation principle is presented to perform nonlinear logistic regression to discriminate between the observed data and some artificially generated noise, using the model log-density function in the regression nonlinearity, which leads to a consistent (convergent) estimator of the parameters.
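This principle can be sketched in a few lines. The toy setup below is a hedged assumption, not from the reference: an unnormalized Gaussian log-density, with its log-normalizer treated as a free parameter `c`, is fitted by logistic discrimination against Gaussian noise with a known density. The model log-density appears inside the regression nonlinearity exactly as the blurb describes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data from N(0, 2); noise from N(0, 4), whose density is known.
n = 50000
data = rng.normal(0.0, np.sqrt(2.0), n)
noise = rng.normal(0.0, 2.0, n)

def log_noise(u):
    return -0.25 * u**2 / 2.0 - 0.5 * np.log(2 * np.pi * 4.0)

# Unnormalized model: log p_m(u) = -lam * u^2 / 2 + c, with c absorbing
# the unknown normalizer. The true precision is lam = 0.5.
lam, c = 1.5, 0.0
lr = 0.05
for _ in range(2000):
    # G(u) = log p_m(u) - log p_n(u): the NCE "regression nonlinearity".
    G_data = (-lam * data**2 / 2.0 + c) - log_noise(data)
    G_noise = (-lam * noise**2 / 2.0 + c) - log_noise(noise)
    s_data = 1.0 / (1.0 + np.exp(-G_data))
    s_noise = 1.0 / (1.0 + np.exp(-G_noise))
    # Gradient ascent on the NCE objective w.r.t. (lam, c).
    g_lam = np.mean((1 - s_data) * (-data**2 / 2)) - np.mean(s_noise * (-noise**2 / 2))
    g_c = np.mean(1 - s_data) - np.mean(s_noise)
    lam += lr * g_lam
    c += lr * g_c

print(lam)  # should approach the true precision 0.5
```

Because G is linear in (lam, c), the objective is concave, so plain gradient ascent recovers both the shape parameter and the normalizing constant, which is the distinguishing feature of NCE over maximum likelihood for unnormalized models.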

### Likelihood-free MCMC with Amortized Approximate Ratio Estimators

- *ICML*, 2020

It is demonstrated that the learned ratio estimator can be embedded in MCMC samplers to approximate likelihood-ratios between consecutive states in the Markov chain, allowing us to draw samples from the intractable posterior.

### Noise-Contrastive Estimation of Unnormalized Statistical Models, with Applications to Natural Image Statistics

- *J. Mach. Learn. Res.*, 2012

The basic idea is to perform nonlinear logistic regression to discriminate between the observed data and some artificially generated noise; the new method is shown to strike a competitive trade-off in comparison to other estimation methods for unnormalized models.

### Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

- *AISTATS*, 2019

It is shown that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and diagnostics for assessing calibration, convergence and goodness-of-fit are discussed.

### CCMI: Classifier-Based Conditional Mutual Information Estimation

- *UAI*, 2019

This paper introduces a likelihood-ratio-based estimator for the KL divergence, obtained by training a classifier to distinguish the observed joint distribution from the product of marginals. It then shows how to construct several conditional mutual information (CMI) estimators from this basic divergence estimator, drawing on ideas from conditional generative models.
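The classifier-based divergence estimator described here admits a short sketch. The Gaussian pair below is an illustrative assumption, not the paper's setup: a logistic classifier is trained to separate samples of P from samples of Q, its logit estimates log p/q, and averaging that logit over samples from P gives a plug-in KL estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy distributions (an assumption): P = N(1, 1), Q = N(0, 1).
# The true KL(P || Q) is 0.5.
n = 50000
p_samples = rng.normal(1.0, 1.0, n)
q_samples = rng.normal(0.0, 1.0, n)

def features(u):
    # The true log p/q is linear in u for this pair, so [1, u] suffices.
    return np.stack([np.ones_like(u), u], axis=1)

X = np.vstack([features(p_samples), features(q_samples)])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Full-batch logistic regression by gradient descent.
w = np.zeros(2)
for _ in range(3000):
    s = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (s - y) / len(y)

# Classifier logit ~ log p(u)/q(u); its mean under P estimates KL(P || Q).
kl_est = float(np.mean(features(p_samples) @ w))
print(kl_est)  # should be close to 0.5
```

Conditioning on a third variable (the CMI case) follows the same pattern, with the classifier discriminating the joint from an appropriately constructed conditional-product distribution.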