• Corpus ID: 220250337

# $\alpha$ Belief Propagation for Approximate Inference

@article{Liu2020alphaBP,
title={\$\alpha\$ Belief Propagation for Approximate Inference},
author={Dong Liu and Minh Th{\`a}nh Vu and Zuxing Li and Lars Kildeh{\o}j Rasmussen},
journal={arXiv: Machine Learning},
year={2020}
}
• Published 27 June 2020
• Computer Science
• arXiv: Machine Learning
The belief propagation (BP) algorithm is a widely used message-passing method for inference in graphical models. On loop-free graphs, BP converges in linear time. On graphs with loops, however, its convergence behavior is uncertain and the understanding of its fixed points is limited. To gain a better understanding of BP on general graphs, we derive an interpretable belief propagation algorithm motivated by the minimization of a localized $\alpha$-divergence. We term this algorithm $\alpha$ belief…
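To make the baseline concrete, here is a minimal sketch of standard sum-product BP on a loop-free graph, the setting in which the abstract notes BP converges in linear time. The 3-node chain, the potentials `psi01`/`psi12`, and the function names are illustrative assumptions, not code from the paper; $\alpha$-BP itself modifies the message updates via a localized $\alpha$-divergence, which this sketch does not implement.

```python
import itertools
import numpy as np

# Hypothetical 3-node binary chain MRF: x0 - x1 - x2,
# with p(x) proportional to psi01(x0, x1) * psi12(x1, x2).
psi01 = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
psi12 = np.array([[1.2, 0.8],
                  [0.3, 1.5]])

def bp_marginal_x1():
    """Sum-product BP marginal of x1; exact on this loop-free chain."""
    m_0to1 = psi01.sum(axis=0)   # message from x0 into x1: sum over x0
    m_2to1 = psi12.sum(axis=1)   # message from x2 into x1: sum over x2
    belief = m_0to1 * m_2to1     # product of incoming messages at x1
    return belief / belief.sum()

def brute_force_marginal_x1():
    """Reference marginal by enumerating all 2^3 configurations."""
    p = np.zeros(2)
    for x0, x1, x2 in itertools.product(range(2), repeat=3):
        p[x1] += psi01[x0, x1] * psi12[x1, x2]
    return p / p.sum()
```

On a tree each message needs to be computed only once, so a single sweep suffices; on loopy graphs the same updates are iterated and may not converge, which is the regime the paper targets.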
