Variational Message Passing

@article{winn2005vmp,
  title={Variational Message Passing},
  author={John M. Winn and Christopher M. Bishop},
  journal={Journal of Machine Learning Research},
  year={2005}
}
This paper presents Variational Message Passing (VMP), a general-purpose algorithm for applying variational inference to a Bayesian network. Like belief propagation, VMP proceeds by passing messages between nodes in the graph and updating posterior beliefs using local operations at each node. Each such update increases a lower bound on the log evidence (unless the bound is already at a local maximum). In contrast to belief propagation, VMP can be applied to a very general class of conjugate-exponential models.
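To give a concrete feel for the local, message-like updates the abstract describes, here is a minimal sketch of mean-field variational inference on the classic conjugate-exponential example: a univariate Gaussian with unknown mean and precision. The prior values, variable names, and the 50-iteration budget are illustrative choices, not taken from the paper; VMP itself derives such updates automatically from the graph structure.

```python
import random

random.seed(0)

# Synthetic data from N(true_mu, 1/true_tau); values are illustrative.
true_mu, true_tau = 2.0, 4.0
data = [random.gauss(true_mu, true_tau ** -0.5) for _ in range(500)]
N = len(data)
xbar = sum(data) / N
sum_x = sum(data)
sum_x2 = sum(x * x for x in data)

# Broad conjugate priors (assumed hyperparameters):
#   mu  ~ N(mu0, (lam0 * tau)^-1),  tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

# Factorised approximation q(mu, tau) = q(mu) q(tau), with
# q(mu) = N(m, s2) and q(tau) = Gamma(a, b).
m, s2 = 0.0, 1.0
a, b = a0, b0

for _ in range(50):
    # "Message" from the tau factor to the mu factor: the expected
    # precision under the current q(tau).
    e_tau = a / b
    # Local update of q(mu); each such coordinate update cannot
    # decrease the lower bound on the log evidence.
    m = (lam0 * mu0 + N * xbar) / (lam0 + N)
    s2 = 1.0 / ((lam0 + N) * e_tau)

    # "Message" from the mu factor to the tau factor: the expected
    # sufficient statistics E[mu] and E[mu^2] under q(mu).
    e_mu, e_mu2 = m, s2 + m * m
    # Local update of q(tau).
    a = a0 + (N + 1) / 2.0
    b = b0 + 0.5 * (sum_x2 - 2.0 * e_mu * sum_x + N * e_mu2
                    + lam0 * (e_mu2 - 2.0 * mu0 * e_mu + mu0 * mu0))

print("posterior mean of mu:", m)
print("posterior mean of tau:", a / b)
```

With 500 observations the posterior mean of mu lands close to 2.0 and the posterior mean of tau close to 4.0. The point of VMP is that updates of exactly this form follow mechanically from the conjugate-exponential structure of the model, so they can be executed as generic message passing on the graph rather than derived by hand.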
This paper has highly influenced 63 other papers and has 640 citations.


Publications citing this paper.
Showing 4 of 369 extracted citations

Learning Inference Models for Computer Vision

ArXiv • 2017
Highly Influenced

MAP inference in dynamic hybrid Bayesian networks

Progress in Artificial Intelligence • 2017
Highly Influenced

Variational Shift Invariant Probabilistic PCA for Face Recognition

18th International Conference on Pattern Recognition (ICPR'06) • 2006
Highly Influenced

Sparse Bayesian Compressed Spectrum Sensing Under Gaussian Mixture Noise

IEEE Transactions on Vehicular Technology • 2018
Highly Influenced

