Quadratization and Roof Duality of Markov Logic Networks

@article{Nijs2016QuadratizationAR,
  title={Quadratization and Roof Duality of Markov Logic Networks},
  author={Roderick de Nijs and Christian Landsiedel and Dirk Wollherr and Martin Buss},
  journal={J. Artif. Intell. Res.},
  year={2016},
  volume={55},
  pages={685--714}
}
This article discusses the quadratization of Markov Logic Networks, which enables efficient approximate MAP computation by means of maximum flows. The procedure relies on a pseudo-Boolean representation of the model and allows handling models of any order. The employed pseudo-Boolean representation can be used to identify problems that are guaranteed to be solvable in low polynomial time. Results on common benchmark problems show that the proposed approach finds optimal assignments for most…
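
As a rough illustration of the pseudo-Boolean machinery involved (a minimal sketch in Python, not the authors' implementation), the snippet below applies one standard substitution for a cubic term with a negative coefficient: c*x1*x2*x3 is replaced by quadratic terms over the original variables and a single fresh auxiliary variable, and a brute-force check confirms that minimizing over the auxiliary variable recovers the cubic term for every assignment. Variable indices and function names are illustrative. Once every higher-order term has been replaced in this way (positive-coefficient terms require a different substitution, omitted here), roof duality can be applied to the resulting quadratic pseudo-Boolean function via a single maximum-flow computation, yielding a lower bound and a partial assignment that agrees with some optimal solution.

from itertools import product

def quadratize_negative_cubic(c, i, j, k, aux):
    # Replace c*x_i*x_j*x_k (c < 0) using the identity
    #   c*x_i*x_j*x_k = min over w in {0,1} of c*w*(x_i + x_j + x_k - 2),
    # where w is a fresh auxiliary variable with index `aux`.
    assert c < 0, "this substitution is exact only for negative coefficients"
    return {(aux, i): c, (aux, j): c, (aux, k): c, (aux,): -2 * c}

def evaluate(terms, assignment):
    # Evaluate a pseudo-Boolean polynomial stored as {variable tuple: coefficient}.
    total = 0.0
    for monomial, coeff in terms.items():
        value = coeff
        for var in monomial:
            value *= assignment[var]
        total += value
    return total

# Sanity check: minimizing over the auxiliary variable reproduces the cubic term.
c = -3.0
quad = quadratize_negative_cubic(c, 1, 2, 3, aux=4)
for x1, x2, x3 in product((0, 1), repeat=3):
    best = min(evaluate(quad, {1: x1, 2: x2, 3: x3, 4: w}) for w in (0, 1))
    assert abs(best - c * x1 * x2 * x3) < 1e-9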


Semantic Mapping for Autonomous Robots in Urban Environments
This work estimates spatial relations between objects using probabilistic logic, including a novel inference method for Markov Logic Networks, and demonstrates the benefits of combining different sources of semantic information with sensor data in a scene interpretation and a semantic localization task.

References

Showing 1-10 of 43 references
Fully Parallel Inference in Markov Logic Networks
A parallel grounding algorithm is proposed that partitions the Markov logic network based on its corresponding join graph; each partition is grounded independently and in parallel, and the approach is well suited to other, more efficient parallel inference techniques.
Maximizing a supermodular pseudoboolean function: A polynomial algorithm for supermodular cubic functions
Cutting Plane MAP Inference for Markov Logic
Cutting Plane Inference is presented for MAP inference in Markov Logic; it incrementally solves partial ground Markov networks, adding formulae only if they are violated in the current solution.
Solving MAP Exactly by Searching on Compiled Arithmetic Circuits
This paper implements a branch-and-bound search in which the bounds are computed using linear-time operations on an arithmetic circuit compiled from the Bayesian network, which can circumvent treewidth-imposed limits by exploiting the local structure present in the network.
Lifted MAP Inference for Markov Logic Networks
This paper presents a new approach for lifted MAP inference in Markov Logic Networks (MLNs) that reduces lifted inference to propositional inference and can easily be applied to an arbitrary MLN by simply grounding all of its shared terms.
Pseudo-Boolean optimization
On the supermodular knapsack problem
This paper introduces binary knapsack problems in which the objective function is nonlinear, investigates their Lagrangean and continuous relaxations, and comments on the complexity of recognizing supermodular functions.
RockIt: Exploiting Parallelism and Symmetry for MAP Inference in Statistical Relational Models
RockIt is a maximum a posteriori (MAP) query engine for statistical relational models that outperforms the state-of-the-art systems Alchemy, Markov TheBeast, and Tuffy in terms of both efficiency and quality of results.
Model Reductions for Inference: Generality of Pairwise, Binary, and Planar Factor Graphs
This work formalizes a notion of "simple reduction" for the problem of inferring marginal probabilities and considers whether it is possible to "simply reduce" marginal inference from general discrete factor graphs to factor graphs in each of seven subclasses defined by the pairwise, binary, and planar restrictions.
A faster strongly polynomial time algorithm for submodular function minimization
A combinatorial algorithm is given that runs in O(n^5 EO + n^6) time, where EO is the time to evaluate f(S) for a submodular function f defined on a set V with n elements.