Corpus ID: 121439686

Understanding belief propagation and its generalizations

@inproceedings{Yedidia2003UnderstandingBP,
  title={Understanding belief propagation and its generalizations},
  author={Jonathan S. Yedidia and William T. Freeman and Yair Weiss},
  year={2003}
}
"Inference" problems arise in statistical physics, computer vision, error-correcting coding theory, and AI. We explain the principles behind the belief propagation (BP) algorithm, which is an efficient way to solve inference problems based on passing local messages. We develop a unified approach, with examples, notation, and graphical models borrowed from the relevant disciplines.We explain the close connection between the BP algorithm and the Bethe approximation of statistical physics. In… Expand
Constructing free-energy approximations and generalized belief propagation algorithms
This work explains how to obtain region-based free energy approximations that improve the Bethe approximation, and corresponding generalized belief propagation (GBP) algorithms, and describes empirical results showing that GBP can significantly outperform BP.
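For context, the Bethe free energy that region-based approximations generalize can be written, for a pairwise MRF with pair potentials ψ_ij, local potentials φ_i, and node degrees d_i (a standard form, assumed here rather than quoted from the paper):

  F_{\text{Bethe}}(b) = \sum_{(i,j)} \sum_{x_i, x_j} b_{ij}(x_i, x_j) \ln \frac{b_{ij}(x_i, x_j)}{\psi_{ij}(x_i, x_j)\, \phi_i(x_i)\, \phi_j(x_j)} \;-\; \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln \frac{b_i(x_i)}{\phi_i(x_i)}

Stationary points of this functional under marginalization constraints correspond to BP fixed points; region-based free energies replace the node and edge terms with larger regions.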
Active Inference, Belief Propagation, and the Bethe Approximation
This work reformulates the approximate inference process using the so-called Bethe approximation and relates the better performance of the Bethe agent to more accurate predictions about the consequences of its own actions, extending the application range of active inference to more complex behavioral tasks.
Message-Passing Algorithms for Inference and Optimization: "Belief Propagation" and "Divide and Concur"
Message-passing algorithms can solve a wide variety of optimization, inference, and constraint satisfaction problems. The algorithms operate on factor graphs that visually represent and specify the…
$\alpha$ Belief Propagation for Approximate Inference
An interpretable belief propagation algorithm, motivated by minimization of a localized $\alpha$-divergence, is derived; it generalizes standard BP, and convergence conditions for $\alpha$-BP are established.
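For reference, the α-divergence being localized is usually taken in the following form (the exact convention is an assumption; this is the parameterization common in the expectation propagation literature):

  D_\alpha(p \,\|\, q) = \frac{1}{\alpha(1 - \alpha)} \int \big( \alpha\, p(x) + (1 - \alpha)\, q(x) - p(x)^{\alpha}\, q(x)^{1 - \alpha} \big)\, dx

As α → 1 this recovers KL(p‖q), so standard BP appears as a limiting special case.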
The Max-Product Algorithm in Pairwise Markov Random Field
"Inference" problems arise in computer vision, AI, statistical physics, and coding theory. Belief propagation is an efficient way to solve inference problems by propagating…
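On a pairwise MRF, the max-product message update replaces the sum in BP with a max (the textbook form, stated as a sketch rather than quoted from this paper):

  m_{i \to j}(x_j) = \max_{x_i} \Big[ \phi_i(x_i)\, \psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus j} m_{k \to i}(x_i) \Big]

On trees, maximizing the resulting max-product beliefs node by node recovers a MAP assignment.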
The Linearization of Belief Propagation on Pairwise Markov Networks
The present paper generalizes all prior work and derives an approach that approximates loopy BP on any pairwise MRF with the problem of solving a linear equation system, which combines exact convergence guarantees and a fast matrix implementation with the ability to model heterogeneous networks.
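The practical payoff of linearization is that inference becomes a single sparse linear solve. A minimal numpy sketch, assuming a generic linearized system of the form (I − εW) b = φ; the particular W, φ, and ε below are illustrative placeholders, not the exact matrices derived in the paper:

  import numpy as np

  # Hypothetical linearized-BP system: centered beliefs b solve (I - eps*W) b = phi.
  W = np.array([[0.0, 1.0, 0.0],
                [1.0, 0.0, 1.0],
                [0.0, 1.0, 0.0]])     # adjacency of a 3-node chain (illustrative)
  phi = np.array([0.4, 0.0, -0.4])    # centered local evidence (illustrative)
  eps = 0.1                           # small coupling keeps I - eps*W invertible

  b = np.linalg.solve(np.eye(3) - eps * W, phi)
  print(b)                            # centered final beliefs

Because the system is linear, convergence reduces to a spectral condition on εW, which is where the exact guarantees mentioned above come from.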
Improving probabilistic inference in graphical models with determinism and cycles
Generalized arc-consistency Expectation Maximization Message-Passing (GEM-MP), a novel message-passing approach to inference in an extended factor graph that combines constraint programming techniques with variational methods, is introduced.

References

Showing 1–10 of 23 references
Loopy Belief Propagation for Approximate Inference: An Empirical Study
This paper compares the marginals computed using loopy propagation to the exact ones in four Bayesian network architectures, including two real-world networks, ALARM and QMR, and finds that the loopy beliefs often converge and, when they do, give a good approximation to the correct marginals.
Expectation Propagation for approximate Bayesian inference
  • T. Minka
  • Computer Science, Mathematics
  • UAI
  • 2001
Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network, which makes it applicable to hybrid networks with discrete and continuous nodes.
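EP's central step is moment matching: the product of a cavity distribution and an intractable factor is replaced by the member of the approximating family with the same moments. A minimal numeric sketch for a Gaussian family (the truncation factor and quadrature grid are illustrative assumptions, not from Minka's paper):

  import numpy as np

  # Match a Gaussian to cavity(x) * factor(x) by computing moments on a grid.
  def moment_match(cavity_mean, cavity_var, factor):
      sd = np.sqrt(cavity_var)
      xs = np.linspace(cavity_mean - 8*sd, cavity_mean + 8*sd, 2001)
      tilted = np.exp(-(xs - cavity_mean)**2 / (2*cavity_var)) * factor(xs)
      dx = xs[1] - xs[0]
      z = tilted.sum() * dx                      # normalizer of the tilted distribution
      mean = (xs * tilted).sum() * dx / z
      var = ((xs - mean)**2 * tilted).sum() * dx / z
      return mean, var                           # parameters of the matched Gaussian

  # Example: a factor that truncates mass to x > 0 (a probit-style likelihood)
  print(moment_match(0.0, 1.0, lambda x: (x > 0).astype(float)))
  # ≈ (0.798, 0.363): the mean and variance of a standard half-normal

Full EP then divides the matched Gaussian by the cavity to obtain the updated factor approximation and iterates over factors until the expectations agree throughout the network.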
Belief Optimization for Binary Networks: A Stable Alternative to Loopy Belief Propagation
A novel inference algorithm for arbitrary, binary, undirected graphs that directly descends on the Bethe free energy, making it ideally suited for learning graphical models from data.
An Idiosyncratic Journey Beyond Mean Field Theory
I try to clarify the relationships between different ways of deriving or correcting mean field theory, and present "translations" between the language of physicists and that of computer scientists.
The Generalized Distributive Law and Free Energy Minimization
In an important recent paper, Yedidia, Freeman, and Weiss [7] showed that there is a close connection between the belief propagation algorithm for probabilistic inference and the Bethe–Kikuchi…
Turbo Decoding as an Instance of Pearl's "Belief Propagation" Algorithm
It is shown that Pearl's algorithm can be used to routinely derive previously known iterative, but suboptimal, decoding algorithms for a number of other error-control systems, including Gallager's low-density parity-check codes, serially concatenated codes, and product codes.
Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
  • S. Geman, D. Geman
  • Mathematics, Medicine
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 1984
An analogy between images and statistical mechanics systems is made; the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, creating a highly parallel "relaxation" algorithm for MAP estimation.
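The sampler itself is simple to state. A minimal sketch of Gibbs sampling on an Ising-style prior (the grid size, coupling J, and free boundaries are illustrative; the paper additionally anneals the temperature to approach the MAP estimate):

  import numpy as np

  rng = np.random.default_rng(0)
  H = W = 16
  J = 0.8                                        # coupling favoring agreeing neighbors
  x = rng.integers(0, 2, size=(H, W)) * 2 - 1    # spins in {-1, +1}

  for sweep in range(100):                       # stochastic-relaxation sweeps
      for i in range(H):
          for j in range(W):
              # sum of the 4-neighborhood, with free boundaries
              s = sum(x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                      if 0 <= a < H and 0 <= b < W)
              # conditional p(x_ij = +1 | neighbors) under the Ising prior
              p_plus = 1.0 / (1.0 + np.exp(-2.0 * J * s))
              x[i, j] = 1 if rng.random() < p_plus else -1

Conditioning on noisy observations just multiplies a per-pixel likelihood into each conditional; annealing the temperature over sweeps drives the chain toward the MAP configuration.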
Factor graphs and the sum-product algorithm
A generic message-passing algorithm, the sum-product algorithm, operates in a factor graph and computes, either exactly or approximately, various marginal functions derived from the global function.
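The sum-product updates on a factor graph take the standard two-message form (stated as a sketch; notation follows the usual convention, with N(·) denoting graph neighbors and \sum_{\sim x} summing over all arguments of f except x):

  m_{x \to f}(x) = \prod_{g \in N(x) \setminus f} m_{g \to x}(x)

  m_{f \to x}(x) = \sum_{\sim x} f(X_f) \prod_{y \in N(f) \setminus x} m_{y \to f}(y)

On tree-structured factor graphs these updates compute marginals exactly; on graphs with cycles they give the loopy approximation discussed throughout this page.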
The generalized distributive law
Although this algorithm is guaranteed to give exact answers only in certain cases (the "junction tree" condition), unfortunately not including graphs with cycles or turbo decoding, there is much experimental evidence, and a few theorems, suggesting that it often works approximately even when it is not supposed to.
Learning Low-Level Vision
A learning-based method for low-level vision problems, estimating scenes from images with Bayesian belief propagation, applied to the "super-resolution" problem (estimating high-frequency details from a low-resolution image), with good results.