# Bayesian Experimental Design for Implicit Models by Mutual Information Neural Estimation

    @article{Kleinegesse2020BayesianED,
      title={Bayesian Experimental Design for Implicit Models by Mutual Information Neural Estimation},
      author={Steven Kleinegesse and Michael U. Gutmann},
      journal={ArXiv},
      year={2020},
      volume={abs/2002.08129}
    }

Implicit stochastic models, where the data-generating distribution is intractable but sampling is possible, are ubiquitous in the natural sciences. These models typically have free parameters that need to be inferred from data collected in scientific experiments. A fundamental question is how to design the experiments so that the collected data are most useful. The field of Bayesian experimental design advocates that, ideally, we should choose designs that maximise the mutual information (MI) between the data and the parameters to be inferred.
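The MI lower bound at the heart of the paper's approach (MINE uses the Donsker-Varadhan representation) can be illustrated numerically. Below is a minimal numpy sketch, assuming a bivariate Gaussian toy problem where the optimal critic is available in closed form; the actual method trains a neural network critic by gradient ascent, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # correlation of the toy joint distribution
n = 200_000

# Joint samples (x, y) from a standard bivariate Gaussian with correlation rho
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Product-of-marginals samples: shuffle y to break the dependence
y_shuf = rng.permutation(y)

def critic(x, y):
    # Optimal critic: the log density ratio log p(x, y) / (p(x) p(y)).
    # MINE would learn this with a neural network; the closed form for
    # Gaussians is used here purely for illustration.
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * x**2 - 2 * rho * x * y + rho**2 * y**2)
              / (2 * (1 - rho**2)))

# Donsker-Varadhan lower bound:
# I(X; Y) >= E_joint[T(x, y)] - log E_marginals[exp(T(x, y))]
dv = critic(x, y).mean() - np.log(np.exp(critic(x, y_shuf)).mean())

true_mi = -0.5 * np.log(1 - rho**2)   # about 0.51 nats
print(dv, true_mi)
```

With the optimal critic the bound is tight, so the Monte Carlo estimate lands close to the true MI; in MINEBED this quantity is maximised jointly over the critic parameters and the experimental design.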

## 26 Citations

Tight Mutual Information Estimation With Contrastive Fenchel-Legendre Optimization

- Computer Science, ArXiv
- 2021

This work revisits the mathematics of popular variational MI bounds through the lens of unnormalized statistical modeling and convex optimization, resulting in a novel, simple, and powerful contrastive MI estimator, named FLO.

A Scalable Gradient-Free Method for Bayesian Experimental Design with Implicit Models

- Computer Science, AISTATS
- 2021

This paper proposes a novel approach that leverages recent advances in stochastic approximate gradient ascent, combined with a smoothed variational MI estimator, for efficient and robust BED.

A Hybrid Gradient Method to Designing Bayesian Experiments for Implicit Models

- Computer Science, ArXiv
- 2021

This work proposes a hybrid gradient approach that leverages recent advances in variational MI estimation and evolution strategies, combined with black-box stochastic gradient ascent (SGA), to maximize the MI lower bound.

Bayesian Optimal Experimental Design for Simulator Models of Cognition

- Computer Science, ArXiv
- 2021

This work combines recent advances in BOED and approximate inference for intractable models, using machine-learning methods to create optimal experimental designs, approximate sufficient summary statistics and amortized posterior distributions.

Efficient Real-world Testing of Causal Decision Making via Bayesian Experimental Design for Contextual Optimisation

- Computer Science, ArXiv
- 2022

A model-agnostic framework for gathering data to evaluate and improve contextual decision making through Bayesian experimental design, achieved by introducing an information-based design objective that is optimised end-to-end.

Bayesian Experimental Design Without Posterior Calculations: An Adversarial Approach

- Computer Science, Bayesian Analysis
- 2022

An efficient alternative approach without posterior calculations, based on optimising the expected trace of the Fisher information, is introduced, which can be used with gradient based optimisation methods to find designs efficiently in practice.

An Optimal Likelihood Free Method for Biological Model Selection

- Computer Science, Biology
- 2022

This work presents an algorithm for automated biological model selection using mathematical models of systems biology and likelihood-free inference methods, and shows improved performance in arriving at correct models without a priori information, compared with conventional heuristics used in experimental biology and random search.

Robust Expected Information Gain for Optimal Bayesian Experimental Design Using Ambiguity Sets

- Computer Science
- 2022

The ranking of experiments by expected information gain (EIG) in Bayesian experimental design is sensitive to changes in the model’s prior distribution, and the approximation of EIG yielded by…

Statistical applications of contrastive learning

- Computer Science, Behaviormetrika
- 2022

This work provides an introduction to contrastive learning and shows how it can be used to derive methods for diverse statistical problems: parameter estimation for energy-based models, Bayesian inference for simulator-based models, and experimental design.

Policy-Based Bayesian Experimental Design for Non-Differentiable Implicit Models

- Computer Science, ArXiv
- 2022

Reinforcement Learning for Deep Adaptive Design (RL-DAD), a method for simulation-based optimal experimental design for non-differentiable implicit models, is introduced and it is found that it performs competitively with baselines on three benchmarks.

## References

Showing 1-10 of 43 references

Towards Bayesian experimental design for nonlinear models that require a large number of sampling times

- Computer Science, Comput. Stat. Data Anal.
- 2014

Monte Carlo Gradient Estimation in Machine Learning

- Computer Science, J. Mach. Learn. Res.
- 2020

A broad and accessible survey of the methods for Monte Carlo gradient estimation in machine learning and across the statistical sciences, exploring three strategies--the pathwise, score function, and measure-valued gradient estimators--exploring their historical developments, derivation, and underlying assumptions.
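Two of the strategies this survey covers, the score-function and pathwise estimators, can be contrasted in a few lines. A minimal numpy sketch on a toy objective (this example is not from the survey; the target E[x^2] under N(mu, 1) and the sample sizes are chosen here for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, n = 1.5, 1_000_000
eps = rng.standard_normal(n)
x = mu + eps                      # reparameterised samples from N(mu, 1)

# Objective: E_{x ~ N(mu, 1)}[x^2]; its exact gradient w.r.t. mu is 2*mu.

# Score-function (REINFORCE) estimator: E[f(x) * d/dmu log p(x; mu)],
# where the score of N(mu, 1) is (x - mu).
grad_score = np.mean(x**2 * (x - mu))

# Pathwise estimator: differentiate through the sampling path x = mu + eps.
grad_path = np.mean(2 * (mu + eps))

print(grad_score, grad_path)      # both near 2 * mu = 3.0
```

Both estimators are unbiased, but the pathwise estimator has far lower variance here, which is the trade-off the survey analyses in depth.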

Sequential Bayesian Experimental Design for Implicit Models via Mutual Information

- Computer Science, ArXiv
- 2020

This work devises a novel sequential design framework for parameter estimation that uses the Mutual Information between model parameters and simulated data as a utility function to find optimal experimental designs, which has not been done before for implicit models.

A Unified Stochastic Gradient Approach to Designing Bayesian-Optimal Experiments

- Computer Science, AISTATS
- 2020

We introduce a fully stochastic gradient based approach to Bayesian optimal experimental design (BOED). Our approach utilizes variational lower bounds on the expected information gain (EIG) of an…

On Variational Bounds of Mutual Information

- Computer Science, ICML
- 2019

This work introduces a continuum of lower bounds that encompasses previous bounds and flexibly trades off bias and variance and demonstrates the effectiveness of these new bounds for estimation and representation learning.
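One well-known member of this family of lower bounds, InfoNCE, is easy to evaluate numerically. A minimal numpy sketch on a Gaussian toy problem, again using the closed-form optimal critic rather than a learned one (the correlation, batch size, and number of batches are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
rho, K, batches = 0.6, 128, 500   # correlation, batch size, no. of batches

def critic(x, y):
    # Log density ratio log p(x, y) / (p(x) p(y)) for standard bivariate
    # Gaussians with correlation rho (the optimal critic up to a constant).
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * x**2 - 2 * rho * x * y + rho**2 * y**2)
              / (2 * (1 - rho**2)))

vals = []
for _ in range(batches):
    x = rng.standard_normal(K)
    y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(K)
    scores = critic(x[:, None], y[None, :])        # K x K critic matrix
    # InfoNCE: classify the true pair (diagonal) against K - 1 negatives
    vals.append(np.mean(np.diag(scores)
                        - np.log(np.mean(np.exp(scores), axis=1))))

nce = np.mean(vals)
true_mi = -0.5 * np.log(1 - rho**2)   # about 0.22 nats
print(nce, true_mi)
```

InfoNCE is capped at log(K) per sample, which makes it low-variance but biased for large MI; here the true MI is well below log(128), so the estimate sits close to it. That bias/variance trade-off is exactly what the continuum of bounds in this paper interpolates over.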

On the efficient determination of optimal Bayesian experimental designs using ABC: A case study in optimal observation of epidemics

- Computer Science
- 2016

Variational Bayesian Optimal Experimental Design

- Computer Science, NeurIPS
- 2019

This work introduces several classes of fast EIG estimators by building on ideas from amortized variational inference, and shows theoretically and empirically that these estimators can provide significant gains in speed and accuracy over previous approaches.

Adaptive Gaussian Copula ABC

- Computer Science, AISTATS
- 2019

This work presents a simple yet effective ABC algorithm that combines two classical ABC approaches, regression ABC and sequential ABC: it first targets an auxiliary distribution that can be learned accurately by existing methods, through which the desired posterior is then learned with the help of a Gaussian copula.

Likelihood-Free Extensions for Bayesian Sequentially Designed Experiments

- Mathematics, Computer Science
- 2016

In this work, likelihood-free extensions of the standard SMC algorithm are proposed and a specific simulation-based approximation of the likelihood known as the synthetic likelihood is investigated.