Corpus ID: 1571829

Finding the M Most Probable Configurations using Loopy Belief Propagation

@inproceedings{Yanover2003FindingTM,
  title={Finding the M Most Probable Configurations using Loopy Belief Propagation},
  author={Chen Yanover and Yair Weiss},
  booktitle={NIPS 2003},
  year={2003}
}
Loopy belief propagation (BP) has been successfully used in a number of difficult graphical models to find the most probable configuration of the hidden variables. In applications ranging from protein folding to image analysis one would like to find not just the best configuration but rather the top M. While this problem has been solved using the junction tree formalism, in many real world problems the clique size in the junction tree is prohibitively large. In this work we address the problem… 
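The abstract contrasts exact top-M computation (via the junction tree) with loopy max-product BP. As a hedged, self-contained sketch (the 3-node cyclic MRF and all potential values below are invented for illustration and are not from the paper), the following compares brute-force top-M enumeration with a loopy max-product decode:

```python
import itertools
import math

# Toy 3-node cyclic MRF with invented potentials (illustration only,
# not the paper's model or experiments).
states = (0, 1)
phi = {0: (0.9, 0.1), 1: (0.4, 0.6), 2: (0.5, 0.5)}  # unary potentials
edges = [(0, 1), (1, 2), (0, 2)]

def psi(x, y):
    # Shared attractive pairwise potential: favors equal neighbors.
    return 1.2 if x == y else 0.8

def score(cfg):
    # Unnormalized probability of a full configuration.
    s = math.prod(phi[i][x] for i, x in enumerate(cfg))
    return s * math.prod(psi(cfg[i], cfg[j]) for i, j in edges)

def top_m_exact(m):
    # Exact top-M by enumeration: feasible only for tiny models. The
    # junction-tree approach generalizes this; the paper targets problems
    # whose junction-tree cliques are prohibitively large.
    return sorted(itertools.product(states, repeat=3), key=score, reverse=True)[:m]

def max_product_decode(iters=20):
    # Loopy max-product: messages msg[(i, j)][x_j], synchronous updates.
    nbrs = {0: (1, 2), 1: (0, 2), 2: (0, 1)}
    msg = {(i, j): [1.0, 1.0] for i in nbrs for j in nbrs[i]}
    for _ in range(iters):
        new = {}
        for (i, j) in msg:
            out = []
            for xj in states:
                vals = []
                for xi in states:
                    v = phi[i][xi] * psi(xi, xj)
                    for k in nbrs[i]:
                        if k != j:
                            v *= msg[(k, i)][xi]
                    vals.append(v)
                out.append(max(vals))
            z = sum(out)  # normalize to avoid numerical drift
            new[(i, j)] = [o / z for o in out]
        msg = new
    # Decode each variable from its max-marginal belief.
    return tuple(
        max(states, key=lambda x: phi[i][x] * math.prod(msg[(k, i)][x] for k in nbrs[i]))
        for i in nbrs
    )
```

On this tiny cycle the loopy decode happens to agree with the exact MAP; on harder loopy graphs that agreement is not guaranteed, which is part of what makes the top-M problem with loopy BP nontrivial.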


Multiple Inference in Graphical Models

This paper surveys inference methods ranging from MAP to diverse inference, which involves finding a set of good solutions that both have high probability and differ from each other.

An LP View of the M-best MAP problem

The method puts the M-best inference problem in the context of LP relaxations, which have recently received considerable attention and have proven useful for difficult inference problems, and often finds the provably exact M best configurations for problems of high tree-width.

An Efficient Message-Passing Algorithm for the M-Best MAP Problem

This paper proposes an efficient message-passing algorithm for the M-Best MAP problem that solves the recently proposed Linear Programming (LP) formulation of M-best MAP while being orders of magnitude faster than a generic LP solver.

Fast Fitness Improvements in Estimation of Distribution Algorithms Using Belief Propagation

This paper shows how belief propagation can be inserted into the Estimation of Bayesian Networks Algorithm (EBNA) to increase its search capabilities by extracting information from the Bayesian network, which is computationally costly to learn.

A parallel framework for loopy belief propagation

This paper presents a parallel approach to one of these inference algorithms, loopy belief propagation on factor graphs, designed to run on clusters of computers or multiprocessors in order to reduce total execution time.

Optimization by Max-Propagation Using Kikuchi Approximations

It is shown that maximum GBP can be combined with a dynamic programming algorithm to find the most probable configurations of a graphical model, and how its different steps can be manipulated to influence the search for optimal solutions.

Diverse M-Best Solutions in Markov Random Fields

This paper proposes an algorithm for the Diverse M-Best problem, which involves finding a diverse set of highly probable solutions under a discrete probabilistic model, and shows that for certain families of dissimilarity functions these solutions can be found as easily as the MAP solution.
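The Diverse M-Best constraint can be illustrated by brute force on a toy scale (the paper itself solves Lagrangian-augmented MAP problems instead, and the objective below is invented): greedily take the best-scoring configuration whose Hamming distance to every previously chosen solution is at least delta.

```python
import itertools

def score(cfg):
    # Made-up toy objective over 4 binary variables.
    w = (3.0, -1.0, 2.0, 0.5)
    bonus = 1.0 if cfg[0] == cfg[2] else 0.0
    return sum(wi for wi, x in zip(w, cfg) if x == 1) + bonus

def diverse_m_best(m, delta):
    # Rank all configurations, then greedily keep those that are at
    # least `delta` Hamming steps from everything already chosen.
    ranked = sorted(itertools.product((0, 1), repeat=4), key=score, reverse=True)
    chosen = []
    for cfg in ranked:
        if all(sum(a != b for a, b in zip(cfg, c)) >= delta for c in chosen):
            chosen.append(cfg)
            if len(chosen) == m:
                break
    return chosen
```

With delta = 1 this reduces to the ordinary M-best problem; larger delta trades probability for diversity, which is the trade-off the paper formalizes.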

Searching for the M Best Solutions in Graphical Models

This paper presents a new algorithm, m-A*, which extends the well-known A* to the m-best task, and proves for the first time that all of its desirable properties, including soundness, completeness, and optimal efficiency, are maintained.
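The idea of best-first m-best search can be sketched as follows (a hedged illustration, not the paper's m-A*: the 4-variable chain MRF and potentials are invented). With an admissible, i.e. optimistic, bound on every partial assignment, complete assignments are popped from the frontier in best-first order, so the first m complete pops are the m best configurations:

```python
import heapq
import itertools
import math

# Toy 4-variable binary chain MRF (invented potentials).
n = 4
states = (0, 1)
phi = [(0.7, 0.3), (0.2, 0.8), (0.5, 0.5), (0.6, 0.45)]  # unary potentials
MAX_PSI = 1.5

def psi(x, y):
    # Chain pairwise potential between variables i and i + 1.
    return 1.5 if x == y else 1.0

def score(cfg):
    s = math.prod(phi[i][x] for i, x in enumerate(cfg))
    return s * math.prod(psi(cfg[i], cfg[i + 1]) for i in range(n - 1))

def bound(partial):
    # Upper bound on the score of any completion of `partial`:
    # assigned factors contribute exactly, unassigned ones their maxima.
    k = len(partial)
    b = math.prod(phi[i][x] for i, x in enumerate(partial))
    b *= math.prod(psi(partial[i], partial[i + 1]) for i in range(k - 1))
    b *= math.prod(max(phi[i]) for i in range(k, n))
    return b * MAX_PSI ** ((n - 1) - max(0, k - 1))

def m_best(m):
    heap = [(-bound(()), ())]
    found = []
    while heap and len(found) < m:
        neg, partial = heapq.heappop(heap)
        if len(partial) == n:
            found.append((partial, -neg))  # bound is exact when complete
        else:
            for x in states:
                child = partial + (x,)
                heapq.heappush(heap, (-bound(child), child))
    return found
```

The admissibility of the bound is what makes the first m complete pops provably optimal, mirroring the soundness argument that the paper establishes for m-A* with a consistent heuristic.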

Computing the M Most Probable Modes of a Graphical Model

This work introduces the M-Modes problem for graphical models: predicting the M label configurations of highest probability that are simultaneously local maxima of the probability landscape. It presents two algorithms for solving the M-Modes problem.
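The M-Modes definition can be checked by brute force on a toy scale (the paper gives efficient algorithms; the ring objective below is invented for illustration): a mode is a configuration that no single-variable flip can improve.

```python
import itertools

N = 5  # binary variables arranged in a ring

def score(cfg):
    # Invented toy objective: reward neighbor agreement on the ring,
    # with a slight bias toward label 1.
    agree = sum(2.0 for i in range(N) if cfg[i] == cfg[(i + 1) % N])
    return agree + 0.1 * sum(cfg)

def m_modes(m):
    # A configuration is a mode iff every single-variable flip strictly
    # lowers its score; return the M highest-scoring modes.
    modes = []
    for cfg in itertools.product((0, 1), repeat=N):
        flips = (cfg[:i] + (1 - cfg[i],) + cfg[i + 1:] for i in range(N))
        if all(score(f) < score(cfg) for f in flips):
            modes.append(cfg)
    return sorted(modes, key=score, reverse=True)[:m]
```

On this ring only the two constant labelings are modes, which shows how M-Modes differs from plain M-best: the second-best configurations overall are small perturbations of the MAP and are not local maxima.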

Dynamic quantization for belief propagation in sparse spaces

...

References


Understanding belief propagation and its generalizations

It is shown that BP can only converge to a fixed point that is also a stationary point of the Bethe approximation to the free energy, which enables connections with variational approaches to approximate inference.

Approximate Inference and Protein-Folding

It is shown that finding a minimal-energy side-chain configuration is equivalent to performing inference in an undirected graphical model, and this equivalence is used to assess the performance of approximate inference algorithms in a real-world setting.

Very loopy belief propagation for unwrapping phase images

This work proposes a new representation for the two-dimensional phase unwrapping problem and shows that loopy belief propagation produces results superior to existing techniques. Interestingly, the graph used has a very large number of very short cycles, providing evidence that a large minimum cycle length is not needed for excellent results with belief propagation.

Exact MAP Estimates by (Hyper)tree Agreement

A method for computing provably exact maximum a posteriori (MAP) estimates is presented for a subclass of problems on graphs with cycles, and a tree-reweighted max-product algorithm is developed that attempts to find convex combinations of tree-structured problems sharing a common optimum.

Tree consistency and bounds on the performance of the max-product algorithm and its generalizations

A novel perspective on the max-product algorithm is provided, based on reparameterizing the distribution in terms of so-called pseudo-max-marginals on the nodes and edges of the graph, giving conceptual insight into the algorithm as applied to graphs with cycles.

Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks

It is shown that Rao-Blackwellised particle filters (RBPFs) lead to more accurate estimates than standard PFs, and are demonstrated on two problems, namely non-stationary online regression with radial basis function networks and robot localization and map building.

Variational Probabilistic Inference and the QMR-DT Network

This work describes a variational approximation method for efficient inference in large-scale probabilistic models and evaluates the algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method.

Variational probabilistic inference and the QMR-DT database

This work describes a variational approximation method for efficient inference in large-scale probabilistic models and compares the algorithm to a state-of-the-art stochastic sampling method.

Probabilistic reasoning in intelligent systems - networks of plausible inference

  • J. Pearl
  • Computer Science
    Morgan Kaufmann series in representation and reasoning
  • 1989
The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic.

Learning to Perceive Transparency from the Statistics of Natural Scenes

It is suggested that transparency is the rational percept of a system that is adapted to the statistics of natural scenes, and a probabilistic model of images based on the qualitative statistics of derivative filters and "corner detectors" in natural scenes is presented.