Bayesian Posterior Sampling via Stochastic Gradient Fisher Scoring

Sungjin Ahn, Anoop Korattikara Balan, Max Welling
In this paper we address the following question: “Can we approximately sample from a Bayesian posterior distribution if we are only allowed to touch a small mini-batch of data-items for every sample we generate?” An algorithm based on the Langevin equation with stochastic gradients (SGLD) was previously proposed to solve this, but its mixing rate was slow. By leveraging the Bayesian Central Limit Theorem, we extend the SGLD algorithm so that at high mixing rates it will sample from a normal…
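The SGLD baseline that the abstract refers to can be illustrated with a short sketch. This is not the paper's proposed Fisher scoring method, only the stochastic-gradient Langevin update it builds on; the toy model (posterior over the mean of a unit-variance Gaussian with a flat prior), function name, and parameter choices are illustrative assumptions.

```python
import numpy as np

def sgld_sample(data, n_iters=5000, batch_size=10, step_size=1e-3, seed=0):
    """Approximate posterior samples for the mean of a unit-variance Gaussian
    with a flat prior, touching only a mini-batch of data per step (SGLD)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    theta = 0.0
    samples = []
    for _ in range(n_iters):
        batch = rng.choice(data, size=batch_size, replace=False)
        # Unbiased stochastic estimate of the log-posterior gradient:
        # the mini-batch likelihood term rescaled by n / batch_size
        # (the flat prior contributes nothing).
        grad = (n / batch_size) * np.sum(batch - theta)
        # Langevin update: half a gradient step plus Gaussian noise whose
        # variance equals the step size.
        theta += 0.5 * step_size * grad + rng.normal(0.0, np.sqrt(step_size))
        samples.append(theta)
    return np.array(samples)
```

For a small, fixed step size the chain hovers around the posterior mode but, as the abstract notes, mixes slowly; the paper's extension addresses this regime.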
This paper has highly influenced 19 other papers, has 183 citations, and has been referenced on Twitter 2 times.

Publications citing this paper
Showing 1-10 of 124 extracted citations

Measuring Sample Quality with Kernels

Stochastic Gradient Descent as Approximate Bayesian Inference. Journal of Machine Learning Research, 2017


Publications referenced by this paper
Showing 1-10 of 11 references

A tutorial on adaptive MCMC. Statistics and Computing, 2008

W. A. Scott. Maximum likelihood estimation using the empirical Fisher information matrix. Journal of Statistical Computation and Simulation, 2002

V. S. Borkar. Stochastic approximation with two time scales. Systems and Control Letters, 1997
