Corpus ID: 251800304

Adversarial Bayesian Simulation

@inproceedings{Wang2022AdversarialBS,
  title={Adversarial Bayesian Simulation},
  author={Yuexing Wang and Veronika Ro\v{c}kov\'a},
  year={2022}
}
In the absence of explicit or tractable likelihoods, Bayesians often resort to approximate Bayesian computation (ABC) for inference. Our work bridges ABC with deep neural implicit samplers based on generative adversarial networks (GANs) and adversarial variational Bayes. Both ABC and GANs compare aspects of observed and fake data to simulate from posteriors and likelihoods, respectively. We develop a Bayesian GAN (B-GAN) sampler that directly targets the posterior by solving an adversarial… 
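To make the idea concrete, the sketch below shows a generic conditional-GAN posterior sampler in the spirit of B-GAN: a generator maps data and noise to parameter draws, and a discriminator compares simulated (parameter, data) pairs against generated ones. The toy Gaussian simulator, prior, network sizes, and training settings are all illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch of a conditional-GAN posterior sampler (B-GAN-style idea).
# Assumptions (not from the paper): toy Gaussian simulator, 1-D parameter,
# network sizes, and optimizer settings are all illustrative choices.
import torch
import torch.nn as nn

torch.manual_seed(0)

def prior(n):                      # theta ~ N(0, 1), illustrative prior
    return torch.randn(n, 1)

def simulate(theta):               # x | theta ~ N(theta, 0.5^2), illustrative likelihood
    return theta + 0.5 * torch.randn_like(theta)

G = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # G(x, z) -> theta
D = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # D(theta, x) -> logit
g_opt = torch.optim.Adam(G.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    theta = prior(128)
    x = simulate(theta)                       # "real" (theta, x) pairs from the joint
    z = torch.randn(128, 1)
    theta_fake = G(torch.cat([x, z], dim=1))  # implicit posterior draw given x

    # Discriminator step: real joint pairs vs. generator pairs.
    d_real = D(torch.cat([theta, x], dim=1))
    d_fake = D(torch.cat([theta_fake.detach(), x], dim=1))
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator step: fool the discriminator.
    d_fake = D(torch.cat([theta_fake, x], dim=1))
    g_loss = bce(d_fake, torch.ones_like(d_fake))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Approximate posterior draws for an observed data point x_obs = 1.0:
x_obs = torch.full((1000, 1), 1.0)
with torch.no_grad():
    post = G(torch.cat([x_obs, torch.randn(1000, 1)], dim=1))
print(post.mean().item(), post.std().item())  # analytic posterior here is N(0.8, 0.2)
```

The key design point is that the generator conditions on data, so a single trained network yields approximate posterior draws for any observation; the conjugate toy model makes the output easy to check against the analytic posterior.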

References

Showing 1–10 of 33 references

GATSBI: Generative Adversarial Training for Simulation-Based Inference

GATSBI opens up opportunities for leveraging advances in GANs to perform Bayesian inference on high-dimensional simulation-based models, and can be extended to sequential posterior estimation that focuses on individual observations.

Metropolis-Hastings via Classification

This paper develops a Bayesian computational platform at the interface between posterior sampling and optimization in models whose marginal likelihoods are difficult to evaluate, reframing the likelihood-function estimation problem as a classification problem.
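As a hedged illustration of the classification trick that this line of work builds on (not the paper's exact estimator), the sketch below estimates the log-likelihood ratio between two parameter values by training a logistic-regression classifier on their simulated outputs, then plugs that estimate into a Metropolis-Hastings acceptance step. The toy simulator, prior, and tuning constants are assumptions.

```python
# Generic sketch of likelihood-free Metropolis-Hastings via classification:
# a classifier trained on draws from p(x | theta') vs p(x | theta) estimates
# the log likelihood ratio through its logit (the density-ratio trick).
# The simulator, prior, and all tuning constants below are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(theta, n):            # toy model: x | theta ~ N(theta, 1)
    return rng.normal(theta, 1.0, size=(n, 1))

def log_ratio(theta_new, theta_old, x_obs, m=2000):
    """Estimate log p(x_obs|theta_new) - log p(x_obs|theta_old) by classification."""
    x1, x0 = simulate(theta_new, m), simulate(theta_old, m)
    X = np.vstack([x1, x0])
    y = np.r_[np.ones(m), np.zeros(m)]
    clf = LogisticRegression().fit(X, y)
    # With balanced classes, the classifier's logit estimates the pointwise
    # log density ratio; summing over iid observations gives the ratio for x_obs.
    return clf.decision_function(x_obs.reshape(-1, 1)).sum()

def log_prior(theta):              # theta ~ N(0, 10^2), illustrative prior
    return -0.5 * theta**2 / 100.0

x_obs = rng.normal(2.0, 1.0, size=50)          # observed data from theta = 2
theta, chain = 0.0, []
for _ in range(500):               # Metropolis-Hastings with estimated ratios
    prop = theta + rng.normal(0, 0.5)
    log_alpha = log_ratio(prop, theta, x_obs) + log_prior(prop) - log_prior(theta)
    if np.log(rng.uniform()) < log_alpha:
        theta = prop
    chain.append(theta)
print(np.mean(chain[100:]))        # should sit near the posterior mean (about 2)
```

Because the acceptance ratio is estimated rather than exact, the resulting chain is approximate; the Gaussian toy model keeps the logistic classifier well specified, since the true log ratio is linear in x.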

Adversarial Variational Bayes: Unifying Variational Autoencoders and Generative Adversarial Networks

Adversarial Variational Bayes (AVB) is a technique for training variational autoencoders with arbitrarily expressive inference models; by introducing an auxiliary discriminative network, the maximum-likelihood problem is rephrased as a two-player game, establishing a principled connection between VAEs and generative adversarial networks (GANs).

How Well Generative Adversarial Networks Learn Distributions

A new notion of regularization, termed generator-discriminator-pair regularization, sheds light on the advantage of GANs over classical parametric and nonparametric approaches to explicit distribution estimation.

Hierarchical Implicit Models and Likelihood-Free Variational Inference

Hierarchical implicit models (HIMs), which combine implicit densities with hierarchical Bayesian modeling to define models via simulators of data with rich hidden structure, are introduced together with likelihood-free variational inference (LFVI), a scalable variational inference algorithm for HIMs.

A Deep Generative Approach to Conditional Sampling

A deep generative approach to sampling from a conditional distribution is proposed, based on a unified formulation of conditional distributions and generalized nonparametric regression via the noise-outsourcing lemma; it outperforms several existing conditional density estimation methods.

Approximate Bayesian Computation via Classification

The theoretical results show that the rate at which ABC posterior distributions concentrate around the true parameter depends on the estimation error of the classifier, and the usefulness of the approach is demonstrated on simulated examples as well as real data in the context of stock volatility estimation.

Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation

This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation that requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.
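A minimal sketch of the underlying idea, under illustrative assumptions: a mixture density network is trained on simulated (theta, x) pairs to approximate p(theta | x) directly, so the posterior at the observed data is available without any accept/reject loop. The toy model, component count, and architecture below are not from the paper.

```python
# Sketch of posterior estimation by conditional density estimation:
# a mixture density network q(theta | x) is fit to simulated (theta, x) pairs
# by maximum likelihood, then evaluated at the observed data. The toy model,
# number of components, and network sizes are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
K = 3                                             # mixture components

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 3 * K))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def mdn_log_prob(x, theta):
    """Log density of a K-component Gaussian mixture q(theta | x)."""
    logits, mu, log_sigma = net(x).chunk(3, dim=1)
    comp = torch.distributions.Normal(mu, log_sigma.exp()).log_prob(theta)  # (n, K)
    return torch.logsumexp(comp + torch.log_softmax(logits, dim=1), dim=1)

for step in range(3000):
    theta = torch.randn(256, 1)                   # theta ~ N(0, 1) prior
    x = theta + 0.5 * torch.randn_like(theta)     # x | theta ~ N(theta, 0.5^2)
    loss = -mdn_log_prob(x, theta).mean()         # maximize simulated log-likelihood
    opt.zero_grad(); loss.backward(); opt.step()

# q(theta | x_obs) now approximates the posterior; the analytic answer for
# x_obs = 1 under this toy model is N(0.8, 0.2).
grid = torch.linspace(-1, 3, 5).unsqueeze(1)
with torch.no_grad():
    print(mdn_log_prob(torch.ones_like(grid), grid).exp())
```

Since the network is trained once on prior draws, a single fit gives an approximate posterior for any observation; sequential variants refine the simulations around the observed data.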

Batch simulations and uncertainty quantification in Gaussian process surrogate approximate Bayesian computation

Batch-sequential Bayesian experimental design strategies are proposed to parallelise the expensive simulations, together with a numerical method to fully quantify the uncertainty in, for example, ABC posterior moments.

Likelihood-free inference with emulator networks

This work presents a new ABC method which uses probabilistic neural emulator networks to learn synthetic likelihoods on simulated data: both local emulators, which approximate the likelihood for specific observed data, and global ones, which apply to a range of datasets.
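As a rough sketch of the emulator idea (illustrative assumptions throughout, not the paper's architecture): a small network maps theta to the mean and variance of a Gaussian synthetic likelihood for a summary statistic, trained by maximum likelihood on simulations; the emulated log-likelihood can then be evaluated anywhere, for example inside MCMC.

```python
# Sketch of a neural synthetic-likelihood emulator: a network maps theta to the
# mean and log-variance of a Gaussian approximation of p(s | theta) for a summary
# statistic s, trained on simulations. The toy model and sizes are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 2))  # theta -> (mu, log_var)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(3000):
    theta = 4 * torch.rand(256, 1) - 2                    # theta ~ U(-2, 2)
    s = theta**2 + 0.3 * torch.randn_like(theta)          # summary s | theta
    mu, log_var = net(theta).chunk(2, dim=1)
    loss = (0.5 * log_var + 0.5 * (s - mu)**2 / log_var.exp()).mean()  # Gaussian NLL
    opt.zero_grad(); loss.backward(); opt.step()

def synthetic_loglik(theta, s_obs):
    """Emulated log N(s_obs; mu(theta), var(theta)), usable inside MCMC or ABC."""
    mu, log_var = net(theta).chunk(2, dim=1)
    return -0.5 * (log_var + (s_obs - mu)**2 / log_var.exp())

with torch.no_grad():                                     # evaluate on a theta grid
    grid = torch.linspace(-2, 2, 9).unsqueeze(1)
    print(synthetic_loglik(grid, s_obs=torch.tensor(1.0)))
```

Unlike the posterior-estimation sketch above, this emulator models the data (summary) given the parameter, so it plays the role of a cheap stand-in for the simulator's intractable likelihood.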