Corpus ID: 12908204

Fast Convergent Algorithms for Expectation Propagation Approximate Bayesian Inference

@inproceedings{Seeger2011FastCA,
  title={Fast Convergent Algorithms for Expectation Propagation Approximate Bayesian Inference},
  author={Matthias W. Seeger and Hannes Nickisch},
  booktitle={AISTATS},
  year={2011}
}
We propose a novel algorithm to solve the expectation propagation relaxation of Bayesian inference for continuous-variable graphical models. In contrast to most previous algorithms, our method is provably convergent. By marrying convergent EP ideas from [15] with covariance decoupling techniques [23, 13], it runs at least an order of magnitude faster than the most common EP solver. 
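
For orientation, the sketch below shows the classical sequential EP iteration that this convergent algorithm is designed to replace: a minimal, generic EP loop for a toy scalar model with a Gaussian prior and probit sites, using the standard probit moment-matching formulas (cf. Rasmussen & Williams, Sec. 3.6). It is not the authors' method; the toy model, the function name ep_probit_1d, and all parameter choices are illustrative assumptions.

import numpy as np
from scipy.stats import norm

def ep_probit_1d(y, v0=1.0, n_sweeps=20):
    """Sequential EP for a scalar latent x ~ N(0, v0) observed through probit
    factors Phi(y_i * x); sites are Gaussians in natural parameters (tau, nu).
    This is the textbook single-loop scheme, not the paper's convergent solver."""
    n = len(y)
    tau = np.zeros(n)                     # site precisions
    nu = np.zeros(n)                      # site precision-times-mean terms
    post_tau, post_nu = 1.0 / v0, 0.0     # posterior natural parameters (prior only)
    for _ in range(n_sweeps):
        for i in range(n):
            # Cavity: remove site i from the current Gaussian posterior.
            cav_tau, cav_nu = post_tau - tau[i], post_nu - nu[i]
            cav_var, cav_mu = 1.0 / cav_tau, cav_nu / cav_tau
            # Moment-match the tilted distribution N(x | cavity) * Phi(y_i x).
            z = y[i] * cav_mu / np.sqrt(1.0 + cav_var)
            r = norm.pdf(z) / norm.cdf(z)
            new_mu = cav_mu + y[i] * cav_var * r / np.sqrt(1.0 + cav_var)
            new_var = cav_var - cav_var ** 2 * r * (z + r) / (1.0 + cav_var)
            # Recover the updated site and fold it back into the posterior.
            tau[i] = 1.0 / new_var - cav_tau
            nu[i] = new_mu / new_var - cav_nu
            post_tau, post_nu = cav_tau + tau[i], cav_nu + nu[i]
    return post_nu / post_tau, 1.0 / post_tau   # posterior mean and variance

print(ep_probit_1d(np.array([1.0, 1.0, -1.0])))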

Expectation Propagation for Bayesian Inference

  • Computer Science
  • 2014
TLDR
This paper introduces Expectation Propagation as a variant of message passing in which each individual message is approximated as it is transferred, starting from Assumed Density Filtering.

Deterministic Approximation Methods in Bayesian Inference

TLDR
This seminar paper gives an introduction to the field of deterministic approximate inference by describing three algorithms: Variational Factorization, Variational Bounds and Expectation Propagation and analyzing the approximations obtained by the three algorithms in terms of convergence and accuracy.

Expectation consistent approximate inference: Generalizations and convergence

TLDR
This work presents a generalization of Opper and Winther's expectation consistent (EC) approximate inference method, called Generalized Expectation Consistency (GEC), which can be applied to both maximum a posteriori (MAP) and minimum mean squared error (MMSE) estimation.

Learning and Free Energy in Expectation Consistent Approximate Inference

TLDR
The combined algorithm, called EM-EC, is shown to have a simple variational free energy interpretation and to provide a computationally efficient and general approach to a number of learning problems with hidden states, including empirical Bayesian forms of regression, classification, compressed sensing, and sparse Bayesian learning.

Fast Bayesian Inference for Non-Conjugate Gaussian Process Regression

We present a new variational inference algorithm for Gaussian process regression with non-conjugate likelihood functions, with application to a wide array of problems including binary and multi-class classification.

Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees

TLDR
This work formulates natural gradient variational inference, expectation propagation, and posterior linearisation as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution under the framework of numerical optimisation, and provides new insights into the connections between various inference schemes.

Tilted Variational Bayes

TLDR
The method combines some of the benefits of VB and EP: it can be used with light-tailed likelihoods (where traditional VB fails), and it provides a lower bound on the marginal likelihood.

Effective Bayesian inference for sparse factor analysis models

TLDR
A novel 'Dense Message Passing' (DMP) algorithm is described which achieves near-optimal performance on synthetic data generated from this model and provides an estimate of the marginal likelihood that can be used for hyperparameter optimisation.

Approximate Gaussian Integration using Expectation Propagation

TLDR
An empirical study of the utility of Expectation Propagation as an approximate integration method for Gaussian cumulative probabilities finds that, in the polyhedral case studied, EP's answer can be almost arbitrarily wrong.

References

Showing 1-10 of 27 references

Expectation Propagation for approximate Bayesian inference

TLDR
Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network, which makes it applicable to hybrid networks with discrete and continuous nodes.

Gaussian Covariance and Scalable Variational Inference

TLDR
This work provides theoretical and empirical insights into algorithmic and statistical consequences of low-rank covariance approximation errors on decision outcomes in nonlinear sequential Bayesian experimental design.
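
The low-rank Gaussian covariance estimation discussed here is one of the covariance decoupling techniques referenced in the abstract above. As a rough illustration (not this paper's method), the snippet below approximates the marginal variances diag(A^{-1}) of a Gaussian with SPD precision matrix A from its k smallest eigenpairs; a dense eigendecomposition stands in for the Lanczos iterations one would use at scale, and the function name, matrix sizes, and choice of k are assumptions made for the example.

import numpy as np

def approx_marginal_variances(A, k=20):
    """Low-rank estimate of diag(A^{-1}) for an SPD precision matrix A from its
    k smallest eigenpairs. Dropping the remaining (positive) terms makes this a
    lower bound on the true marginal variances."""
    w, V = np.linalg.eigh(A)                        # eigenvalues in ascending order
    return np.sum(V[:, :k] ** 2 / w[:k], axis=1)    # diag of V_k diag(1/w_k) V_k^T

rng = np.random.default_rng(0)
B = rng.standard_normal((200, 50))
A = B.T @ B + np.eye(50)                            # SPD precision matrix for the demo
print(np.round(approx_marginal_variances(A, k=20)[:5], 4))   # low-rank estimate
print(np.round(np.diag(np.linalg.inv(A))[:5], 4))            # exact marginal variances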

Convex variational Bayesian inference for large scale generalized linear models

We show how variational Bayesian inference can be implemented for very large generalized linear models. Our relaxation is proven to be a convex problem for any log-concave model. We provide a generic …

Assessing Approximate Inference for Binary Gaussian Process Classification

TLDR
This work reviews and compares Laplace's method and Expectation Propagation for approximate Bayesian inference in the binary Gaussian process classification model, and presents a comprehensive comparison of the approximations, their predictive performance and marginal likelihood estimates to results obtained by MCMC sampling.

Bayesian Inference and Optimal Design in the Sparse Linear Model

TLDR
This work shows how to obtain a good approximation to Bayesian analysis efficiently, using the Expectation Propagation method, and addresses the problems of optimal design and hyperparameter estimation.

Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models

  • M. Seeger, H. Nickisch
  • Computer Science
    Sampling-based Optimization in the Presence of Uncertainty
  • 2009
TLDR
A long-standing open question about variational Bayesian inference for continuous variable models is settled, and the Gaussian lower bound relaxation is proved to be a convex optimization problem if and only if the posterior mode can be found by convex programming.

Approximations for Binary Gaussian Process Classification

We provide a comprehensive overview of many recent algorithms for approximate inference in Gaussian process models for probabilistic binary classification. The relationships between several …

Bayesian Experimental Design of Magnetic Resonance Imaging Sequences

TLDR
This work proposes the first Bayesian experimental design framework for magnetic resonance imaging, which requires large-scale approximate inference for dense, non-Gaussian models, and develops a novel scalable variational inference algorithm to make it tractable.

Large Scale Bayesian Inference and Experimental Design for Sparse Linear Models

TLDR
This work shows how higher-order Bayesian decision-making problems, such as optimizing image acquisition in magnetic resonance scanners, can be addressed by querying the SLM posterior covariance, which is unrelated to the density's mode, and proposes a scalable algorithmic framework with which SLM posteriors over full, high-resolution images can be approximated for the first time.

Gaussian Processes for Ordinal Regression

We present a probabilistic kernel approach to ordinal regression based on Gaussian processes. A threshold model that generalizes the probit function is used as the likelihood function for ordinal variables.