Factor graphs and the sum-product algorithm

  • Frank R. Kschischang, Brendan J. Frey, Hans-Andrea Loeliger
  • IEEE Trans. Inf. Theory
Algorithms that must deal with complicated global functions of many variables often exploit the manner in which the given functions factor as a product of "local" functions, each of which depends on a subset of the variables. Following a single, simple computational rule, the sum-product algorithm computes, either exactly or approximately, various marginal functions derived from the global function. A wide variety of algorithms developed in artificial intelligence, signal processing, and digital…
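A minimal sketch of the idea in this abstract (the factor values and variable names here are illustrative, not taken from the paper): for a global function that factors as g(x1, x2) = f1(x1)·f2(x1, x2), the sum-product rule computes the marginal of x2 by summing out x1 locally, which agrees with brute-force summation over the global function.

```python
import numpy as np

# Illustrative chain factor graph: g(x1, x2) = f1(x1) * f2(x1, x2),
# with both variables binary. These factor values are made up.
f1 = np.array([0.6, 0.4])                # local factor on x1
f2 = np.array([[0.9, 0.1], [0.2, 0.8]])  # local factor on (x1, x2)

# Sum-product message to x2: sum out x1 of f1(x1) * f2(x1, x2).
msg_to_x2 = f1 @ f2                      # the marginal function of x2

# Brute force over the global function gives the same answer.
brute = np.array([sum(f1[a] * f2[a, b] for a in range(2)) for b in range(2)])
print(np.allclose(msg_to_x2, brute))     # True
```

On a cycle-free graph this local summation is exact; on graphs with cycles the same rule is applied iteratively and gives approximate marginals.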

Convergence Analysis of Reweighted Sum-Product Algorithms

This paper studies the convergence and stability properties of the family of reweighted sum-product algorithms, a generalization of the widely used sum-product or belief propagation algorithm, in which messages are adjusted with graph-dependent weights.

The Entropy Message Passing: A New Algorithm Over Factor Graphs

It is shown how the EMP can be used for efficient computation of the model entropy and of the complex expressions that appear in the expectation-maximization and gradient-descent algorithms.

Discrete geometric analysis of message passing algorithm on graphs

The primary contribution of this thesis is the discovery of a formula that establishes the relation among the LBP algorithm, the Bethe free energy, and the graph zeta function, and provides new techniques for analysis of the LBP algorithm.

On the Fixed Points of the Max-product Algorithm

Here it is proved that the posterior probability of the max-product assignment is guaranteed to be greater than all other assignments in a particular large region around that assignment in any subset of nodes that form no more than a single loop in the graph.

An Introduction to factor graphs - Signal Processing Magazine, IEEE

A unified perspective of probability propagation in Bayesian networks and artificial intelligence is given in terms of (Forney-style) factor graphs.

Faster Algorithms for Max-Product Message-Passing

It is demonstrated that message-passing operations in graphical models with maximal cliques that are larger than their constituent factors are equivalent to some variant of matrix multiplication in the tropical semiring, for which an O(N^2.5) expected-case solution is offered.
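To make the tropical-semiring connection mentioned above concrete, here is a hedged illustration (the matrices are arbitrary, and this is not the paper's O(N^2.5) method): replacing (+, ×) with (max, +) turns a matrix product into exactly the max-product message update over a middle variable.

```python
import numpy as np

def tropical_matmul(A, B):
    # C[i, j] = max_k (A[i, k] + B[k, j]) -- matrix product with
    # (max, +) in place of (+, *), i.e. a max-product message update
    # in the log domain over the summed-out middle index k.
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

A = np.array([[0.0, -1.0], [2.0, 0.5]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
C = tropical_matmul(A, B)
print(C)  # each entry is the best (max-sum) score through the middle index
```

Any speedup for this product in the tropical semiring therefore accelerates max-product message passing over large cliques.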

A factor-graph approach to Lagrangian and Hamiltonian dynamics

  • P. Vontobel
  • Computer Science
    2011 IEEE International Symposium on Information Theory Proceedings
  • 2011
This paper aims at extending the field of possible applications of factor graphs to Lagrangian and Hamiltonian dynamics, and finds that duality results for factor graphs allow one to derive Noether's theorem easily.

The sum-product algorithm: algebraic independence and computational aspects

This work shows that the variable sets involved in an acyclic factorization satisfy a relation that is a natural generalization of probability-theoretic independence, and shows that for the Boolean semiring the sum-product algorithm reduces to a classical algorithm of database theory.
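A small sketch of the Boolean-semiring reduction described above (the relations here are invented for illustration): with "or" as the sum and "and" as the product, a sum-product marginal becomes the classical join-then-project operation of database theory.

```python
# Two local relations, encoded as sets of allowed tuples.
R1 = {(0, 1), (1, 1)}          # relation over (x, y)
R2 = {(1, 0), (1, 1)}          # relation over (y, z)

# "Message" from R1 to y: Boolean sum (or) over x = projection onto y.
y_support = {y for (_, y) in R1}

# Boolean product (and) with R2, then project onto z: the values of z
# that extend to some globally consistent assignment (x, y, z).
z_support = {z for (y, z) in R2 if y in y_support}
print(sorted(z_support))
```

The sum-product schedule thus reproduces an acyclic join plan, which is the classical database algorithm the abstract refers to.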

Message-Passing Algorithms for Inference and Optimization

The Divide and Concur algorithm is a projection-based constraint satisfaction algorithm that deals naturally with continuous variables, and converges to exact answers for problems where the solution sets of the constraints are convex.

Double-edge factor graphs: Definition, properties, and examples

This paper introduces a class of factor graphs, called double-edge factor graphs (DE-FGs), which allow local functions to be complex-valued and only require them, in some suitable sense, to be positive semi-definite kernel functions.

The generalized distributive law

Although this algorithm is guaranteed to give exact answers only in certain cases (the "junction tree" condition), unfortunately not including the cases of GTW (Gallager-Tanner-Wiberg) decoding with cycles or turbo decoding, there is much experimental evidence, and a few theorems, suggesting that it often works approximately even when it is not supposed to.
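The saving that the generalized distributive law exploits can be shown with a toy example (values chosen arbitrarily): the distributive law turns a sum of products over a full cross product into a product of local sums.

```python
# sum_{a,b} f(a) * g(b) computed two ways.
f = [1.0, 2.0, 3.0]
g = [4.0, 5.0]

naive = sum(fa * gb for fa in f for gb in g)   # |f|*|g| = 6 multiplications
fast = sum(f) * sum(g)                         # 1 multiplication
print(naive == fast)                           # True
```

The GDL generalizes this rearrangement to arbitrary commutative semirings and junction trees, which is where the exactness condition in the abstract comes from.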

Codes and Decoding on General Graphs

It is shown that many iterative decoding algorithms are special cases of two generic algorithms, the min-sum and sum-product algorithms, which also encompass non-iterative algorithms such as Viterbi decoding.

Codes on graphs: normal realizations

  • G. Forney
  • Computer Science
    2000 IEEE International Symposium on Information Theory (Cat. No.00CH37060)
  • 2000
This work shows that any Wiberg-type realization may be put into normal form without essential change in its graph or its decoding complexity, and shows that an appropriately defined dual of a group or linear normal realization realizes the dual group or linear code.

Turbo Decoding as an Instance of Pearl's "Belief Propagation" Algorithm

It is shown that Pearl's algorithm can be used to routinely derive previously known iterative, but suboptimal, decoding algorithms for a number of other error-control systems, including Gallager's low-density parity-check codes, serially concatenated codes, and product codes.

A recursive approach to low complexity codes

It is shown that choosing a transmission order for the digits that is appropriate for the graph and the subcodes can give the code excellent burst-error correction abilities.

Codes and iterative decoding on general graphs

The main thesis of the present paper is that, with respect to iterative decoding, the natural way of describing a code is by means of a Tanner graph, which may be viewed as a generalized trellis.

Contribution to the discussion of the paper by Steffen L. Lauritzen and David Spiegelhalter: "Local Computations with Probabilities on Graphical Structures and their Application to Expert Systems"

This work exploits a range of local representations for the joint probability distribution, combined with topological changes to the original network termed `marrying' and `filling-in', allowing efficient algorithms for transfer between representations and providing rapid absorption and propagation of evidence.

Iterative decoding of binary block and convolutional codes

Using log-likelihood algebra, it is shown that any decoder can be used which accepts soft inputs (including a priori values) and delivers soft outputs that can be split into three terms: the soft channel and a priori inputs, and the extrinsic value.
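A hedged numerical sketch of the three-term split mentioned above (the LLR values are invented, and this additive decomposition is the simplest case of the log-likelihood algebra, not the paper's full decoder): the extrinsic value is what remains of the soft output after subtracting the channel and a priori inputs.

```python
# Illustrative log-likelihood ratios (LLRs) for one bit; values made up.
L_channel = 1.2   # soft channel input
L_apriori = 0.4   # a priori LLR
L_other   = 0.9   # contribution from the rest of the code constraints

# Soft output decomposes additively in the log domain.
L_out = L_channel + L_apriori + L_other
L_extrinsic = L_out - L_channel - L_apriori
print(L_extrinsic)  # 0.9: the "new" information passed to the next stage
```

In iterative (turbo-style) decoding, only this extrinsic term is fed to the other constituent decoder as its a priori input, which avoids reusing the same information twice.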

Variational Learning in Nonlinear Gaussian Belief Networks

This article presents a general variational method that maximizes a lower bound on the likelihood of a training set and gives results on two visual feature extraction problems.

Computers and Intractability: A Guide to the Theory of NP-Completeness

It is proved here that the number of rules in any irredundant Horn knowledge base involving n propositional variables is at most n − 1 times the minimum possible number of rules.