MAP estimation via agreement on trees: message-passing and linear programming

@article{Wainwright2005MAPEV,
  title={MAP estimation via agreement on trees: message-passing and linear programming},
  author={M. Wainwright and T. Jaakkola and A. Willsky},
  journal={IEEE Transactions on Information Theory},
  year={2005},
  volume={51},
  pages={3697-3717}
}
We develop and analyze methods for computing provably optimal maximum a posteriori probability (MAP) configurations for a subclass of Markov random fields defined on graphs with cycles. By decomposing the original distribution into a convex combination of tree-structured distributions, we obtain an upper bound on the optimal value of the original problem (i.e., the log probability of the MAP assignment) in terms of the combined optimal values of the tree problems. We prove that this upper bound…
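As a sketch of the core bound, written in the exponential-family notation commonly used in this line of work (the symbols θ, φ, and ρ are chosen here for illustration rather than quoted from the abstract): if the exponential parameter decomposes as a convex combination θ = Σ_T ρ(T) θ(T) over spanning trees T, then

\[
\max_{x}\,\langle \theta, \phi(x)\rangle
\;=\; \max_{x}\,\sum_{T}\rho(T)\,\langle \theta(T), \phi(x)\rangle
\;\le\; \sum_{T}\rho(T)\,\max_{x}\,\langle \theta(T), \phi(x)\rangle,
\]

where each tree term is computable exactly by max-product (dynamic programming) on T. The bound is tight precisely when some configuration is simultaneously optimal for every tree problem, which is the "agreement on trees" of the title.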
Citations

Message-Passing Algorithms for MAP Estimation Using DC Programming
The linear programming formulation of MAP is analyzed through the lens of difference of convex functions (DC) programming, and the concave-convex procedure (CCCP) is used to develop efficient message-passing solvers.
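For context, the generic concave-convex procedure referred to here minimizes a difference of convex functions f = u − v by iterating a linearized surrogate (a textbook form, not the specific solver derived in that paper):

\[
x^{(t+1)} \;\in\; \arg\min_{x}\; u(x) \;-\; \big\langle \nabla v\big(x^{(t)}\big),\, x \big\rangle .
\]

Because the linearization minorizes the convex function v, the surrogate majorizes f and agrees with it at x^{(t)}, so each iteration decreases the objective monotonically.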
Revisiting MAP Estimation, Message Passing and Perfect Graphs
An earlier article claimed that the max-product linear programming (MPLP) message-passing techniques of Globerson and Jaakkola (2007) are guaranteed to solve MAP estimation problems on perfect graphs exactly and efficiently; this work investigates that claim, shows that it does not hold, and repairs it with alternative message-passing algorithms.
Message-passing for Graph-structured Linear Programs: Proximal Methods and Rounding Schemes
Develops a family of super-linearly convergent algorithms for solving linear programming (LP) relaxations, based on proximal minimization schemes using Bregman divergences, and proposes graph-structured randomized rounding schemes applicable to iterative LP-solving algorithms in general.
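A generic proximal step of the kind described would look as follows (notation chosen here for illustration: L(G) is the local polytope of the LP relaxation, D is a Bregman divergence such as the entropic one, and η_t is a proximal weight):

\[
\mu^{(t+1)} \;=\; \arg\max_{\mu \in \mathbb{L}(G)} \;\Big\{ \langle \theta, \mu \rangle \;-\; \tfrac{1}{\eta_t}\, D\big(\mu \,\|\, \mu^{(t)}\big) \Big\},
\]

with the inner subproblems amenable to graph-structured message-passing updates.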
Approximate inference: decomposition methods with applications to networks
The Markov random field (MRF) model provides an elegant probabilistic framework for formulating interdependency between a large number of random variables. In this thesis, we present a new approximation…
Approximate message-passing inference algorithm
In a recent result, Weitz (2006) established an equivalence between the marginal distribution of a node in any binary pairwise Markov random field (MRF), say G, and the marginal…
Message Passing for Maximum Weight Independent Set
If max-product is started from the natural initialization of uninformative messages, it always solves the correct LP if it converges; moreover, a simple modification of max-product becomes gradient descent on (a smoothed version of) the dual of the LP and converges to the dual optimum.
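The LP referred to is the natural edge relaxation of maximum weight independent set, stated below in generic notation (the graph G = (V, E) and node weights w_i are illustrative):

\[
\max_{x}\; \sum_{i \in V} w_i\, x_i
\quad \text{s.t.} \quad x_i + x_j \le 1 \;\; \forall\, (i,j) \in E, \qquad 0 \le x_i \le 1 \;\; \forall\, i \in V .
\]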
MAP Estimation, Message Passing, and Perfect Graphs
A general graph framework is provided for determining when MAP estimation in any graphical model is in P, has an integral linear programming relaxation and is exactly recoverable by message passing.
Convergent and Correct Message Passing Schemes for Optimization Problems over Graphical Models
This work provides a simple derivation of a new family of message-passing algorithms obtained by "splitting" the factors of the graphical model, and proves that, for any objective function that attains its maximum value over its domain, this new family always contains a message-passing scheme that guarantees correctness upon convergence to a unique estimate.
Dynamic Tree Block Coordinate Ascent
A novel Linear Programming (LP) based algorithm, called Dynamic Tree-Block Coordinate Ascent (DT-BCA), for performing maximum a posteriori (MAP) inference in probabilistic graphical models, which dynamically chooses regions of the factor graph on which to focus message-passing efforts.
Local approximate inference algorithms
It is shown that the normalized log-partition function (also known as free energy) for a class of regular MRFs converges to a limit that is computable to arbitrary accuracy.

References

Showing 1-10 of 54 references
Tree consistency and bounds on the performance of the max-product algorithm and its generalizations
Provides a novel perspective on the max-product algorithm, based on the idea of reparameterizing the distribution in terms of so-called pseudo-max-marginals on nodes and edges of the graph, which yields conceptual insight into its behavior on graphs with cycles.
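Concretely, for a pairwise MRF this reparameterization expresses the distribution in terms of the pseudo-max-marginals ν_s and ν_st (notation chosen here for illustration):

\[
p(x) \;\propto\; \prod_{s \in V} \nu_s(x_s) \prod_{(s,t) \in E} \frac{\nu_{st}(x_s, x_t)}{\nu_s(x_s)\,\nu_t(x_t)},
\]

so that fixed points of max-product correspond to local consistency conditions on {ν_s, ν_st} rather than to exact max-marginals of the distribution.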
On the optimality of solutions of the max-product belief-propagation algorithm in arbitrary graphs
It is shown that the assignment based on a fixed point of max-product is a "neighborhood maximum" of the posterior probabilities: the posterior probability of the max-product assignment is guaranteed to be greater than that of all other assignments in a particular large region around that assignment.
On the Optimality of Tree-reweighted Max-product Message-passing
This paper demonstrates how part of the optimal solution (i.e., a provably optimal assignment for a subset of nodes) can be identified without knowing a complete solution, and establishes that for submodular functions a weak tree agreement (WTA) fixed point always yields a globally optimal solution.
A new class of upper bounds on the log partition function
A new class of upper bounds on the log partition function of a Markov random field (MRF) is introduced, based on concepts from convex duality and information geometry, and the Legendre mapping between exponential and mean parameters is exploited.
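The basic inequality behind these bounds follows from the convexity of the log partition function A(·) in the exponential parameter (notation again chosen here for illustration): for any convex combination θ = Σ_T ρ(T) θ(T) over spanning trees,

\[
A(\theta) \;=\; A\Big(\sum_{T} \rho(T)\, \theta(T)\Big) \;\le\; \sum_{T} \rho(T)\, A\big(\theta(T)\big),
\]

where each tree term A(θ(T)) is computable exactly by sum-product on the corresponding tree; optimizing over the tree parameters and edge appearance probabilities tightens the bound.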
Tree-based reparameterization framework for analysis of sum-product and related algorithms
We present a tree-based reparameterization (TRP) framework that provides a new conceptual view of a large class of algorithms for computing approximate marginals in graphs with cycles. This class…
Understanding belief propagation and its generalizations
It is shown that BP can only converge to a fixed point that is also a stationary point of the Bethe approximation to the free energy, which enables connections to be made with variational approaches to approximate inference.
Convergent Tree-Reweighted Message Passing for Energy Minimization
  • V. Kolmogorov
  • Computer Science
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2006
This paper develops a modification of the technique proposed by Wainwright et al. (Nov. 2005), called sequential tree-reweighted message passing (TRW-S), which outperforms both ordinary belief propagation and the tree-reweighted algorithm on both synthetic and real problems.
Using Linear Programming to Decode Binary Linear Codes
The definition of a pseudocodeword unifies other such notions known for iterative algorithms, including "stopping sets," "irreducible closed walks," "trellis cycles," "deviation sets," and "graph covers"; the associated fractional distance is a lower bound on the classical distance.
Information geometry on hierarchy of probability distributions
  • S. Amari
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 2001
An exponential or mixture family of probability distributions has a natural hierarchical structure; this work gives an orthogonal decomposition of such a system, which is important for extracting intrinsic interactions in firing patterns of an ensemble of neurons and for estimating its functional connections.
Linear Programming-Based Decoding of Turbo-Like Codes and its Relation to Iterative Approaches
This paper gives an iterative decoder whose output is equivalent to that of the LP decoder, and extends the ML certificate property to the more efficient iterative tree-reweighted max-product message-passing algorithm developed by Wainwright, Jaakkola, and Willsky.