Corpus ID: 85485755

The Linearization of Pairwise Markov Networks

@article{Gatterbauer2015TheLO,
  title={The Linearization of Pairwise Markov Networks},
  author={Wolfgang Gatterbauer},
  journal={ArXiv},
  year={2015},
  volume={abs/1502.04956}
}
Belief Propagation (BP) is a method for approximating exact probabilistic inference in graphical models such as Markov networks (also called Markov random fields, or undirected graphical models). In general, however, no exact convergence guarantees for BP are known. Recent work has proposed approximating BP by linearizing its update equations around default values, for the special case in which every edge in the Markov network carries the same symmetric, doubly stochastic potential. This linearization has… 
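The linearization idea can be sketched numerically. The following is an illustrative sketch only, not the paper's exact algorithm or notation: all names (`B` for residual beliefs, `E` for residual explicit beliefs, `A` for the adjacency matrix, `H` for a centered doubly stochastic potential) are assumptions chosen for exposition. Centering around the uniform default turns the multiplicative BP update into a linear one.

```python
import numpy as np

# Illustrative sketch (assumed names and formulation, not the paper's exact
# notation): a linearized BP update around the uniform default belief.
# B holds residual beliefs (n x k), E residual explicit beliefs (n x k),
# A the adjacency matrix (n x n), H a centered doubly stochastic potential (k x k).
def linbp(A, E, H, iters=200):
    B = E.copy()
    for _ in range(iters):
        # Linear update: each node adds its neighbors' beliefs passed through H.
        B = E + A @ B @ H
    return B

# Tiny example: a 4-node path graph with residual evidence at one node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.array([[0.6, 0.4],
              [0.4, 0.6]]) - 0.5   # symmetric doubly stochastic potential, centered
E = np.zeros((4, 2))
E[0] = [0.1, -0.1]                 # residual evidence at node 0

B = linbp(A, E, H)
```

The iteration converges here because the spectral radius of the linear operator (roughly, the product of the spectral radii of `A` and `H`) is below one; this is the kind of condition that makes exact convergence guarantees possible for the linearized system.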


ZooBP: Belief Propagation for Heterogeneous Networks

ZooBP is proposed, a method for performing fast BP on undirected heterogeneous graphs with provable convergence guarantees. Its main advantage is generality: it works on heterogeneous graphs with multiple types of nodes and edges, and gives a closed-form solution as well as convergence guarantees.

Factorized Graph Representations for Semi-Supervised Learning from Sparse Data

This work suggests a principled and scalable method for directly estimating the compatibilities from a sparsely labeled graph and refers to algebraic amplification as the underlying idea of leveraging algebraic properties of an algorithm's update equations to amplify sparse signals in data.

Mining Anomalies using Static and Dynamic Graphs

The completed work can detect anomalous dense subgraphs and edges in near real-time, by only storing a small synopsis of the graph seen so far and requiring no supervision, and shows how to early warn against user-labeled anomalies in the presence of confounding interventions.

References

Showing 1-10 of 25 references

Correctness of Local Probability Propagation in Graphical Models with Loops

An analytical relationship is derived between the probabilities computed using local propagation and the correct marginals and a category of graphical models with loops for which local propagation gives rise to provably optimal maximum a posteriori assignments (although the computed marginals will be incorrect).

Linearized and Single-Pass Belief Propagation

Linearized Belief Propagation (LinBP) is a linearization of BP that admits a closed-form solution via intuitive matrix equations and thus comes with exact convergence guarantees; it also allows fast incremental updates in dynamic networks.
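The closed-form idea can be sketched as solving a linear fixed-point system directly. This is a hedged illustration under assumed names (`B`, `E`, `A`, `H` are not the paper's exact notation): a fixed point of B = E + A B H satisfies vec(B) = (I - Hᵀ ⊗ A)⁻¹ vec(E), which exists and is the unique limit whenever the spectral radius of Hᵀ ⊗ A is below one.

```python
import numpy as np

# Hedged sketch (illustrative names, not the paper's exact notation):
# solve the linear fixed-point system B = E + A B H in closed form via
# vectorization, guarded by the spectral-radius convergence condition.
def linbp_closed_form(A, E, H):
    n, k = E.shape
    M = np.kron(H.T, A)                        # system matrix acting on vec(B)
    rho = np.abs(np.linalg.eigvals(M)).max()   # spectral radius
    if rho >= 1:
        raise ValueError(f"spectral radius {rho:.3f} >= 1: no convergence guarantee")
    vec_b = np.linalg.solve(np.eye(n * k) - M, E.ravel(order="F"))
    return vec_b.reshape((n, k), order="F")

# Tiny example: 3-node path, homophily potential centered around uniform.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.array([[0.1, -0.1], [-0.1, 0.1]])
E = np.zeros((3, 2))
E[0] = [0.2, -0.2]
B = linbp_closed_form(A, E, H)
```

The column-stacking `order="F"` matches the vec identity vec(A X H) = (Hᵀ ⊗ A) vec(X); with an explicit system matrix, the convergence guarantee reduces to a single spectral-radius check.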

Residual Belief Propagation: Informed Scheduling for Asynchronous Message Passing

RBP is proposed: a novel, easy-to-implement, asynchronous propagation algorithm that schedules messages in an informed way, pushing down a bound on the distance from the fixed point. Experiments demonstrate the superiority of RBP over state-of-the-art methods on a variety of challenging synthetic and real-life problems.

Loopy Belief Propagation: Convergence and Effects of Message Errors

This analysis leads to convergence conditions for traditional BP message passing, and both strict bounds and estimates of the resulting error in systems of approximate BP message passing.

Understanding belief propagation and its generalizations

It is shown that BP can only converge to a fixed point that is also a stationary point of the Bethe approximation to the free energy, which enables connections to be made with variational approaches to approximate inference.

Sufficient Conditions for Convergence of the Sum–Product Algorithm

Novel conditions are derived that guarantee convergence of the sum-product algorithm to a unique fixed point, irrespective of the initial messages, for parallel (synchronous) updates; the resulting bound outperforms existing bounds.

Efficient Belief Propagation for Early Vision

Algorithmic techniques are presented that substantially improve the running time of the loopy belief propagation approach and reduce the complexity of the inference algorithm to be linear rather than quadratic in the number of possible labels for each pixel, which is important for problems such as image restoration that have a large label set.

Spectral redemption in clustering sparse networks

A way of encoding sparse data using a “nonbacktracking” matrix is introduced, and it is shown that the corresponding spectral algorithm performs optimally for some popular generative models, including the stochastic block model.

Focused Belief Propagation for Query-Specific Inference

Given the variable that the user actually cares about, this work shows how to quantify edge importance in graphical models and to significantly speed up inference by focusing computation on important parts of the model.

Factor graphs and the sum-product algorithm

A generic message-passing algorithm, the sum-product algorithm, operates in a factor graph and computes, either exactly or approximately, various marginal functions derived from the global function.