Corpus ID: 220363617

The FMRIB Variational Bayesian Inference Tutorial II: Stochastic Variational Bayes

Michael A. Chappell and Mark W. Woolrich
Bayesian methods have proved powerful in many applications for the inference of model parameters from data. These methods are based on Bayes' theorem, which itself is deceptively simple. However, in practice the computations required are intractable even for simple cases. Hence methods for Bayesian inference have historically either been significantly approximate, e.g., the Laplace approximation, or achieve samples from the exact solution at significant computational expense, e.g., Markov Chain… 
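The abstract's point that Bayes' theorem is "deceptively simple" can be made concrete: for a discrete hypothesis space the theorem is a one-line computation, and intractability only appears once the evidence term becomes a high-dimensional integral. The two-hypothesis numbers below are made up purely for illustration:

```python
# Bayes' theorem: p(H | data) = p(data | H) * p(H) / p(data).
# Hypothetical two-hypothesis example with made-up numbers.
prior = {"H0": 0.5, "H1": 0.5}
likelihood = {"H0": 0.2, "H1": 0.8}  # p(data | H)

# The evidence p(data) is a sum here; for continuous models it is an
# integral over all parameters, which is what makes inference intractable.
evidence = sum(likelihood[h] * prior[h] for h in prior)
posterior = {h: likelihood[h] * prior[h] / evidence for h in prior}

print(posterior)  # posterior probabilities, summing to 1
```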


The FMRIB Variational Bayes Tutorial: Variational Bayesian Inference for a Non-Linear Forward Model

The Variational Bayes (VB) method, proposed by Attias (2000), facilitates analytical calculation of the posterior distributions over a model; it takes an iterative approach resembling the Expectation Maximisation method, and its convergence is guaranteed.

Auto-Encoding Variational Bayes

This paper introduces a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
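The core of that algorithm is the reparameterization trick: sampling z = mu + sigma * eps makes the Monte Carlo ELBO estimate differentiable in the variational parameters. A minimal sketch, assuming a Gaussian variational family and a toy target posterior N(2, 1) (both illustrative assumptions, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational parameters of q(z) = N(mu, sigma^2).
mu, log_sigma = 0.0, 0.0

def elbo_grad(mu, log_sigma, n_samples=256):
    """Monte Carlo ELBO gradient via the reparameterization z = mu + sigma * eps."""
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps                    # differentiable in (mu, sigma)
    dlogp_dz = 2.0 - z                      # d/dz of log N(z; 2, 1)
    grad_mu = np.mean(dlogp_dz)             # chain rule: dz/dmu = 1
    # dz/dlog_sigma = sigma * eps; the +1 is d/dlog_sigma of the entropy of q.
    grad_log_sigma = np.mean(dlogp_dz * eps * sigma) + 1.0
    return grad_mu, grad_log_sigma

# Plain stochastic gradient ascent on the ELBO.
for _ in range(2000):
    g_mu, g_ls = elbo_grad(mu, log_sigma)
    mu += 0.05 * g_mu
    log_sigma += 0.05 * g_ls

print(mu, np.exp(log_sigma))  # should approach the true posterior N(2, 1)
```

Because the noise eps is drawn from a fixed distribution, the gradient passes through the sampling step, which is what lets generic stochastic optimisers drive the inference.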

A Variational Baysian Framework for Graphical Models

This paper presents a novel practical framework for Bayesian model averaging and model selection in probabilistic graphical models. Our approach approximates full posterior distributions over model…

The Generalized Reparameterization Gradient

The generalized reparameterization gradient is introduced, a method that extends the reparameterization gradient to a wider class of variational distributions and results in new Monte Carlo gradients that combine reparameterization gradients and score function gradients.
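The two estimator families being combined can be contrasted on a toy objective E_q[z^2] with q = N(mu, 1); both the objective and the Gaussian family are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, n = 1.5, 100_000
eps = rng.standard_normal(n)
z = mu + eps  # samples from q(z) = N(mu, 1)

# True gradient of E_q[z^2] = mu^2 + 1 with respect to mu is 2 * mu.
# Score-function (REINFORCE) estimator: E[f(z) * d/dmu log q(z)],
# where the score d/dmu log q(z) = (z - mu). Works for any f, even non-differentiable.
grad_score = np.mean(z**2 * (z - mu))

# Reparameterization estimator: E[d/dmu f(mu + eps)] = E[2 * z].
# Requires a differentiable f, but typically has much lower variance.
grad_reparam = np.mean(2 * z)

print(grad_score, grad_reparam)  # both near 2 * mu = 3.0
```

The generalized reparameterization gradient targets distributions (e.g. gamma or Dirichlet) where no exact standardising transform like z = mu + eps exists, blending a partial reparameterization with a score-function correction term.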
