# Bayesian active learning for posterior estimation

@inproceedings{Kandasamy2015BayesianAL, title={Bayesian active learning for posterior estimation}, author={Kirthevasan Kandasamy and Jeff G. Schneider and Barnab{\'a}s P{\'o}czos}, year={2015} }

This paper studies active posterior estimation in a Bayesian setting when the likelihood is expensive to evaluate. Existing techniques for posterior estimation are based on generating samples representative of the posterior; such methods do not consider efficiency in terms of likelihood evaluations. In order to be query efficient, we treat posterior estimation in an active regression framework. We propose two myopic query strategies to choose where to evaluate the likelihood and implement them…
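The active regression framing described in the abstract can be illustrated with a minimal sketch (not the paper's exact acquisition rules): fit a Gaussian process surrogate to observed log-likelihood values and myopically query the candidate point with the highest predictive variance. The `log_likelihood` function below is a hypothetical stand-in for an expensive simulator.

```python
import numpy as np

# Minimal sketch of GP-based active posterior estimation: a GP surrogate
# is fit to log-likelihood evaluations, and each query is chosen by
# uncertainty sampling (one simple myopic strategy).

def rbf(a, b, ls=0.1):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP predictive mean and variance at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xs)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 0.0)

def log_likelihood(theta):              # hypothetical expensive simulator
    return -0.5 * ((theta - 0.3) / 0.1) ** 2

grid = np.linspace(0.0, 1.0, 200)       # candidate query locations
X = np.array([0.1, 0.9])                # initial design
y = log_likelihood(X)

for _ in range(10):                     # myopic active-learning loop
    _, var = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(var)]       # uncertainty-sampling query rule
    X = np.append(X, x_next)
    y = np.append(y, log_likelihood(x_next))

# Exponentiate the surrogate mean and normalise on the grid to obtain
# the posterior estimate.
mu, _ = gp_posterior(X, y, grid)
dx = grid[1] - grid[0]
post = np.exp(mu - mu.max())
post /= post.sum() * dx
```

After twelve likelihood evaluations the normalised surrogate posterior concentrates near the true mode, whereas a sampling method would typically need far more evaluations.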

## 36 Citations

Query efficient posterior estimation in scientific experiments via Bayesian active learning

- Computer Science · Artif. Intell.
- 2017

Adaptive Gaussian Process Approximation for Bayesian Inference with Expensive Likelihood Functions

- Computer Science · Neural Computation
- 2018

A Gaussian process (GP)-based method is presented to approximate the joint distribution of the unknown parameters and the data; building on recent work, an adaptive algorithm is provided to construct such an approximation.

Efficient Acquisition Rules for Model-Based Approximate Bayesian Computation

- Computer Science, Business · Bayesian Analysis
- 2019

This paper proposes to compute the uncertainty in the ABC posterior density that arises from having too few simulations to estimate this quantity accurately; it defines a loss function measuring this uncertainty and selects the next evaluation location to minimise the expected loss.

Variational Bayesian Monte Carlo

- Computer Science · NeurIPS
- 2018

A novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC), which combines variational inference with Gaussian-process based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective.

Parallel Gaussian process surrogate method to accelerate likelihood-free inference

- Computer Science · ArXiv
- 2019

Motivated by recent progress in batch Bayesian optimisation, this work develops batch-sequential strategies in which multiple simulations are adaptively selected to minimise either the expected or the median loss function measuring the uncertainty in the resulting posterior.

Inverse Gaussian Process regression for likelihood-free inference

- Computer Science, Mathematics
- 2021

This work presents a method to compute an approximation of the posterior with a limited number of model simulations using inverse Gaussian process regression (IGPR), and provides an adaptive algorithm with a tempering procedure to construct approximations of the marginal posterior distributions.

Ranking Documents Through Stochastic Sampling on Bayesian Network-based Models: A Pilot Study

- Computer Science · SIGIR
- 2016

Experimental results suggest that the performance of the Bayesian network model is at least comparable to baselines such as BM25, and that the framework of this model potentially offers novel ways of weighting documents.

Factor Screening using Bayesian Active Learning and Gaussian Process Meta-Modelling

- Computer Science · 2020 25th International Conference on Pattern Recognition (ICPR)
- 2021

In this paper we propose a data-efficient Bayesian active learning framework for factor screening, which is important when dealing with systems which are expensive to evaluate, such as combat…

An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo

- Computer Science · AABI
- 2018

The performance of VBMC under variations of two key components of the framework is studied, and a new general family of acquisition functions for active sampling is proposed and evaluated, which includes as special cases the acquisition functions used in the original work.

## References

SHOWING 1-10 OF 32 REFERENCES

Sampling for Inference in Probabilistic Models with Fast Bayesian Quadrature

- Computer Science · NIPS
- 2014

A warped model for probabilistic integrands (likelihoods) that are known to be non-negative is introduced, permitting a cheap active learning scheme to optimally select sample locations.

A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning

- Computer Science · ArXiv
- 2010

A tutorial on Bayesian optimization, a method of finding the maximum of expensive cost functions using the Bayesian technique of setting a prior over the objective function and combining it with evidence to get a posterior function.
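The loop the tutorial describes (place a GP prior on the objective, condition on past evaluations, and pick the next query via an acquisition function) can be sketched with the standard expected-improvement rule. The toy objective below is a hypothetical stand-in for an expensive cost function.

```python
import numpy as np
from math import erf, sqrt, pi

# Sketch of sequential Bayesian optimisation: a GP prior on the objective
# is combined with evidence (evaluations) to get a posterior, which is
# queried where expected improvement (EI) is largest.

def rbf(a, b, ls=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp(X, y, Xs, noise=1e-6):
    """GP predictive mean and variance at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.maximum(var, 1e-12)

def norm_pdf(z):
    return np.exp(-0.5 * z**2) / sqrt(2 * pi)

def norm_cdf(z):
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2)))

def expected_improvement(mu, var, best):
    s = np.sqrt(var)
    z = (mu - best) / s
    return s * (z * norm_cdf(z) + norm_pdf(z))

def objective(x):                       # hypothetical expensive function
    return np.sin(3 * x) * np.exp(-x)

grid = np.linspace(0.0, 2.0, 300)
X = np.array([0.2, 1.8])                # initial evaluations
y = objective(X)

for _ in range(8):                      # sequential BO loop
    mu, var = gp(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, var, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))
```

EI trades off exploitation (high predicted mean) against exploration (high predictive variance), so only a handful of evaluations are needed to locate the maximum of this smooth one-dimensional objective.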

Active Learning of Model Evidence Using Bayesian Quadrature

- Computer Science · NIPS
- 2012

This work proposes a novel Bayesian Quadrature approach for numerical integration when the integrand is non-negative, such as the case of computing the marginal likelihood, predictive distribution, or normalising constant of a probabilistic model.
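The basic Bayesian quadrature idea can be sketched in its vanilla form (without the non-negativity modelling this paper proposes): place a GP on the integrand and take the integral of the GP posterior mean as the evidence estimate. The integrand below is a hypothetical unnormalised likelihood.

```python
import numpy as np

# Vanilla Bayesian quadrature sketch: model the integrand with a GP and
# integrate the posterior mean numerically on a dense grid. (The cited
# paper additionally exploits non-negativity; this sketch does not.)

def rbf(a, b, ls=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def integrand(t):                       # hypothetical unnormalised likelihood
    return np.exp(-0.5 * ((t - 0.5) / 0.2) ** 2)

X = np.linspace(0.0, 1.0, 6)            # a handful of expensive evaluations
y = integrand(X)

grid = np.linspace(0.0, 1.0, 500)
K = rbf(X, X) + 1e-8 * np.eye(len(X))
mean = rbf(X, grid).T @ np.linalg.solve(K, y)   # GP posterior mean on grid

dx = grid[1] - grid[0]
Z_bq = mean.sum() * dx                  # evidence (marginal likelihood) estimate
```

Because the GP interpolates smoothly between the six evaluations, the evidence estimate is already close to the true integral of the bump, which is the appeal of quadrature over naive Monte Carlo when evaluations are costly.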

Nested sampling for general Bayesian computation

- Mathematics
- 2006

Nested sampling estimates directly how the likelihood function relates to prior mass. The evidence (alternatively the marginal likelihood, marginal density of the data, or the prior predictive) is…

Approximate Bayesian computational methods

- Computer Science · Stat. Comput.
- 2012

This survey studies the various improvements and extensions brought to the original ABC algorithm in recent years.

Active Area Search via Bayesian Quadrature

- Computer Science · AISTATS
- 2014

This paper presents an algorithm for the case where the positive class is defined in terms of a region's average function value being above some threshold with high probability, a problem the authors call active area search; the proposed algorithm outperforms previous algorithms developed for other active search goals.

On sequential Monte Carlo, partial rejection control and approximate Bayesian computation

- Mathematics, Computer Science · Stat. Comput.
- 2012

It is proved that the new sampler can reduce the variance of the incremental importance weights when compared with standard sequential Monte Carlo samplers, and a central limit theorem is provided.

Markov chain Monte Carlo without likelihoods

- Mathematics · Proceedings of the National Academy of Sciences of the United States of America
- 2003

A Markov chain Monte Carlo method for generating observations from a posterior distribution without the use of likelihoods is presented, which can be used in frequentist applications, in particular for maximum-likelihood estimation.

Gaussian Process Regression: Active Data Selection and Test Point Rejection

- Mathematics · DAGM-Symposium
- 2000

It is found that, for both a two-dimensional toy problem and a real-world benchmark problem, the variance is a reasonable criterion for both active data selection and test point rejection.

Kernel Bayes' rule: Bayesian inference with positive definite kernels

- Computer Science · J. Mach. Learn. Res.
- 2013

A kernel method for realizing Bayes' rule is proposed, based on representations of probabilities in reproducing kernel Hilbert spaces, with applications including Bayesian computation without likelihoods and filtering with a nonparametric state-space model.