• Computer Science, Mathematics
  • Published in J. Mach. Learn. Res. 2015

Distributed Bayesian Learning with Stochastic Natural Gradient Expectation Propagation and the Posterior Server

@article{Hasenclever2015DistributedBL,
  title={Distributed Bayesian Learning with Stochastic Natural Gradient Expectation Propagation and the Posterior Server},
  author={Leonard Hasenclever and Stefan Webb and Thibaut Lienart and Sebastian Vollmer and Balaji Lakshminarayanan and Charles Blundell and Yee Whye Teh},
  journal={ArXiv},
  year={2015},
  volume={abs/1512.09327}
}

This paper makes two contributions to Bayesian machine learning algorithms. First, we propose stochastic natural gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference algorithm. SNEP is a black-box variational algorithm: it requires no simplifying assumptions on the distribution of interest beyond the existence of some Monte Carlo sampler for estimating the moments of the EP tilted distributions. Further…
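The abstract's key requirement is that SNEP only needs a Monte Carlo sampler for the moments of the EP tilted distributions. A minimal sketch of that moment-estimation step, assuming a Gaussian approximating family (so the relevant moments are the mean and covariance) and an illustrative `sampler` interface not taken from the paper:

```python
import numpy as np

def tilted_moments(sampler, n_samples=10_000):
    """Estimate the first and second moments of a tilted distribution.

    `sampler(n)` is assumed to return an (n, d) array of draws from the
    tilted distribution (e.g. produced by an MCMC chain); the name and
    signature are illustrative, not from the paper.
    """
    x = sampler(n_samples)
    mean = x.mean(axis=0)               # empirical first moment
    cov = np.cov(x, rowvar=False)       # empirical (central) second moment
    return mean, cov

# Toy usage: a standard normal stands in for the tilted distribution.
rng = np.random.default_rng(0)
mu, Sigma = tilted_moments(lambda n: rng.normal(size=(n, 2)))
```

In a black-box setting like this, the sampler is the only problem-specific component: swapping in a different model only changes what `sampler` draws from, not the moment-matching code.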

Citations

Publications citing this paper.
SHOWING 1-10 OF 26 CITATIONS

Relativistic Monte Carlo

CITES METHODS

PAC-Bayesian Neural Network Bounds

  • 2019
CITES METHODS

Approximate Collapsed Gibbs Clustering with Expectation Propagation

CITES METHODS & BACKGROUND

References

Publications referenced by this paper.
SHOWING 1-10 OF 62 REFERENCES

Large Scale Distributed Deep Networks

HIGHLY INFLUENTIAL

Adam: A Method for Stochastic Optimization

HIGHLY INFLUENTIAL

Deep learning with Elastic Averaging SGD

HIGHLY INFLUENTIAL

Graphical Models, Exponential Families, and Variational Inference

HIGHLY INFLUENTIAL

Expectation propagation for approximate inference in dynamic Bayesian networks

HIGHLY INFLUENTIAL

A family of algorithms for approximate Bayesian inference

HIGHLY INFLUENTIAL