Likelihood-Free Inference with Generative Neural Networks via Scoring Rule Minimization
@article{Pacchiardi2022LikelihoodFreeIW,
  title   = {Likelihood-Free Inference with Generative Neural Networks via Scoring Rule Minimization},
  author  = {Lorenzo Pacchiardi and Ritabrata Dutta},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2205.15784}
}
Bayesian likelihood-free inference methods yield posterior approximations for simulator models with intractable likelihoods. Recently, many works have trained neural networks to approximate either the intractable likelihood or the posterior directly. Most proposals use normalizing flows, namely neural networks parametrizing invertible maps used to transform samples from an underlying base measure; the probability density of the transformed samples is then accessible, and the normalizing flow can be…
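The property the abstract relies on — that a normalizing flow gives access to the density of its transformed samples — follows from the change-of-variables formula. A minimal sketch with a one-dimensional affine map (the names `a`, `b`, `flow_log_density` are illustrative, not from the paper):

```python
import numpy as np

def base_log_density(z):
    """Log density of a standard normal base measure."""
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

def flow_log_density(x, a, b):
    """Log density of x = f(z) = a * z + b, z ~ N(0, 1), via change of
    variables: log p_x(x) = log p_z(f^{-1}(x)) + log |d f^{-1}/dx|."""
    z = (x - b) / a                # inverse map
    log_det = -np.log(np.abs(a))   # log |dz/dx| for the affine map
    return base_log_density(z) + log_det

# Sanity check: x = a*z + b with z ~ N(0,1) is N(b, a^2).
x, a, b = 1.3, 2.0, 0.5
closed_form = -0.5 * ((x - b) ** 2 / a ** 2 + np.log(2 * np.pi * a ** 2))
assert np.isclose(flow_log_density(x, a, b), closed_form)
```

Real flows compose many such invertible layers; the log-determinants simply add along the composition.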
References
Showing 1–10 of 42 references
Probabilistic Forecasting with Conditional Generative Networks via Scoring Rule Minimization
- Computer Science · ArXiv
- 2021
This manuscript performs probabilistic forecasting with conditional generative networks trained to minimize scoring rule values, applied to two chaotic models and a global dataset of weather observations; the results are satisfactory and better calibrated than those achieved by GANs.
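Training by scoring rule minimization typically uses a proper scoring rule such as the energy score, estimated from simulator draws. A minimal sketch of a sample-based energy-score estimate (the name `energy_score` and the test data are illustrative, not from the paper):

```python
import numpy as np

def energy_score(samples, y):
    """Sample estimate of the energy score S(P, y) = E||X - y|| - 0.5 E||X - X'||
    for observation y (shape d) from m simulator draws (shape m x d).
    Lower values indicate a better probabilistic forecast."""
    m = samples.shape[0]
    term1 = np.mean(np.linalg.norm(samples - y, axis=1))
    diffs = samples[:, None, :] - samples[None, :, :]
    # Diagonal of the pairwise-distance matrix is zero, so dividing the full
    # sum by m*(m-1) gives the off-diagonal average.
    term2 = np.sum(np.linalg.norm(diffs, axis=-1)) / (m * (m - 1))
    return term1 - 0.5 * term2

# A forecast centred on the observation scores better than a biased one.
rng = np.random.default_rng(0)
y = np.zeros(2)
centred = rng.normal(0.0, 1.0, size=(200, 2))
biased = rng.normal(5.0, 1.0, size=(200, 2))
assert energy_score(centred, y) < energy_score(biased, y)
```

In the generative-network setting, the draws come from the network conditioned on the forecast inputs, and the score is averaged over a training batch and minimized by gradient descent.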
f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization
- Computer Science · NIPS
- 2016
It is shown that any f-divergence can be used for training generative neural samplers, and the benefits of various choices of divergence function for training complexity and the quality of the obtained generative models are discussed.
Generative Adversarial Nets
- Computer Science · NIPS
- 2014
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a…
Auto-Encoding Variational Bayes
- Computer Science · ICLR
- 2014
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
GATSBI: Generative Adversarial Training for Simulation-Based Inference
- Computer Science · ArXiv
- 2022
GATSBI opens up opportunities for leveraging advances in GANs to perform Bayesian inference on high-dimensional simulation-based models, and shows how GATSBI can be extended to perform sequential posterior estimation to focus on individual observations.
BayesFlow: Learning Complex Stochastic Models With Invertible Neural Networks
- Computer Science · IEEE Transactions on Neural Networks and Learning Systems
- 2022
It is argued that BayesFlow provides a general framework for building amortized Bayesian parameter estimation machines for any forward model from which data can be simulated and is applicable to modeling scenarios where standard inference techniques with handcrafted summary statistics fail.
Automatic Posterior Transformation for Likelihood-Free Inference
- Computer Science · ICML
- 2019
Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.
Flexible statistical inference for mechanistic models of neural dynamics
- Computer Science · NIPS
- 2017
This work builds on recent advances in ABC by learning a neural network which maps features of the observed data to the posterior distribution over parameters, and learns a Bayesian mixture-density network approximating the posterior over multiple rounds of adaptively chosen simulations.
MMD-Bayes: Robust Bayesian Estimation via Maximum Mean Discrepancy
- Mathematics, Computer Science · AABI
- 2019
A pseudo-likelihood based on the Maximum Mean Discrepancy (MMD), defined via an embedding of probability distributions into a reproducing kernel Hilbert space, is built, and the resulting MMD-Bayes posterior is shown to be consistent and robust to model misspecification.
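The MMD underlying this pseudo-likelihood compares two samples through kernel mean embeddings. A minimal sketch of a Gaussian-kernel MMD estimate (the names `gaussian_kernel`, `mmd_squared` and the bandwidth choice are illustrative, not from the paper):

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    """Gaussian kernel matrix k(x_i, y_j) for samples x (n x d) and y (m x d)."""
    sq_dists = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2 * bandwidth ** 2))

def mmd_squared(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared MMD between the samples:
    MMD^2 = E k(X,X') + E k(Y,Y') - 2 E k(X,Y)."""
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2 * kxy

# Samples from the same distribution give a much smaller MMD than
# samples from well-separated distributions.
rng = np.random.default_rng(1)
a = rng.normal(0, 1, size=(100, 1))
b = rng.normal(0, 1, size=(100, 1))
c = rng.normal(3, 1, size=(100, 1))
assert mmd_squared(a, b) < mmd_squared(a, c)
```

In the MMD-Bayes construction, a quantity of this form (between observed data and data simulated at a parameter value) replaces the intractable likelihood inside the posterior.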
Bayesian Synthetic Likelihood
- Computer Science, Mathematics
- 2017
The accuracy and computational efficiency of the Bayesian synthetic likelihood (BSL) approach are explored in comparison to a competitor known as approximate Bayesian computation (ABC), along with its sensitivity to tuning parameters and assumptions.