Evaluation and Comparison Criteria for Approaches to Probabilistic Relational Knowledge Representation

@inproceedings{Beierle2011EvaluationAC,
  title={Evaluation and Comparison Criteria for Approaches to Probabilistic Relational Knowledge Representation},
  author={Christoph Beierle and Marc Finthammer and Gabriele Kern-Isberner and Matthias Thimm},
  booktitle={KI},
  year={2011}
}
In the past ten years, the areas of probabilistic inductive logic programming and statistical relational learning have put forth a large collection of approaches that combine relational representations of knowledge with probabilistic reasoning. Here, we develop a series of evaluation and comparison criteria for these approaches, focusing on the point of view of knowledge representation and reasoning. These criteria address abstract demands such as language aspects, the relationships to propositional…

Comparing and Evaluating Approaches to Probabilistic Reasoning: Theory, Implementation, and Applications

The maximum entropy approach is featured as a powerful and elegant method that combines convenient knowledge representation with excellent inference properties in first-order probabilistic reasoning.
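To make the maximum entropy idea concrete, here is a small self-contained sketch (the toy knowledge base, feature encoding, and all names are my own illustration, not code from any of the papers listed here): given a single probabilistic conditional (B|A)[0.8] over the four worlds of two atoms A and B, the ME model has the form p(w) ∝ exp(λ·f(w)) with the conditional feature f(w) = 1_{A∧B}(w) − 0.8·1_{A}(w), and λ is chosen so that E_p[f] = 0, i.e. P(B|A) = 0.8.

```python
import math

# Toy maximum entropy inference for one probabilistic conditional (B|A)[0.8].
# Worlds are truth assignments to the two atoms A and B.
WORLDS = [(a, b) for a in (0, 1) for b in (0, 1)]
P_TARGET = 0.8

def feature(world):
    a, b = world
    # "conditional verified" minus target probability times "conditional applicable"
    return (a * b) - P_TARGET * a

def distribution(lam):
    # Gibbs form of the ME distribution: p(w) proportional to exp(lam * f(w))
    weights = [math.exp(lam * feature(w)) for w in WORLDS]
    z = sum(weights)
    return [wt / z for wt in weights]

def expected_feature(lam):
    return sum(p * feature(w) for p, w in zip(distribution(lam), WORLDS))

def solve_lambda(lo=-50.0, hi=50.0, tol=1e-12):
    # E_p[f] is nondecreasing in lam (its derivative is Var(f)),
    # so plain bisection finds the root of E_p[f](lam) = 0.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if expected_feature(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

lam = solve_lambda()
p = dict(zip(WORLDS, distribution(lam)))
p_b_given_a = p[(1, 1)] / (p[(1, 1)] + p[(1, 0)])
print(round(p_b_given_a, 6))  # recovers the constraint: 0.8
```

Note how the ME solution leaves the two worlds not touched by the conditional (those with ¬A) at equal probability: the constraint is satisfied while everything unconstrained stays as uniform as possible, which is exactly the inference behavior the surveyed papers exploit.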

A Two-Level Approach to Maximum Entropy Model Computation for Relational Probabilistic Logic Based on Weighted Conditional Impacts

The notion of weighted conditional impacts is defined, a two-level approach to maximum entropy model computation based on them is presented, and a generalized iterative scaling algorithm is employed that fully abstracts from concrete worlds.

On Lifted Inference for a Relational Probabilistic Conditional Logic with Maximum Entropy Semantics

This paper investigates the relational probabilistic conditional logic FO-PCL, whose semantics employs the principle of maximum entropy, derives a new syntactic criterion for parametric uniformity, and develops an algorithm that transforms any FO-PCL knowledge base R into an equivalent knowledge base R′ that is parametrically uniform.

Achieving parametric uniformity for knowledge bases in a relational probabilistic conditional logic with maximum entropy semantics

A new syntactic criterion for parametric uniformity is derived, and an algorithm is developed that transforms any FO-PCL knowledge base into an equivalent knowledge base that is parametrically uniform, thereby providing a basis for a simplified maximum entropy model computation.

Instantiation Restrictions for Relational Probabilistic Conditionals

The role of instantiation restrictions for two recently proposed semantics for relational probabilistic conditionals employing the maximum entropy principle is discussed. Aggregating semantics is…

References


Relational Probabilistic Conditional Reasoning at Maximum Entropy

The formalisms discussed in this paper are relational extensions of a propositional conditional logic based on the principle of maximum entropy that can be used in different ways to realize model-based inference relations for first-order probabilistic knowledge bases.

On the Problem of Grounding a Relational Probabilistic Conditional Knowledge Base

This paper formulates properties that can guide the search for suitable grounding operators and presents three operators, the most sophisticated of which implements a stratified use of a specificity relation so that more specific information on objects is given priority over less specific information.

Novel Semantical Approaches to Relational Probabilistic Conditionals

This paper proposes novel semantical perspectives on first-order (or relational) probabilistic conditionals that are motivated by considering them as subjective, but population-based statements, and presents two inference operators that are shown to yield reasonable inferences.

Probabilistic Inductive Logic Programming

This chapter outlines three classical settings for inductive logic programming, namely learning from entailment, learning from interpretations, and learning from proofs or traces, and shows how they can be adapted to cover state-of-the-art statistical relational learning approaches.

Symbolic and Quantitative Approaches to Reasoning and Uncertainty

This volume contains the papers accepted for presentation at ECSQARU-93, the European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty, held at the University of Granada, Spain, November 8-10, 1993.

Probabilistic reasoning in intelligent systems - networks of plausible inference

  • J. Pearl
  • Computer Science
    Morgan Kaufmann series in representation and reasoning
  • 1989
The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic.

The Relationship of the Logic of Big-Stepped Probabilities to Standard Probabilistic Logics

By using Goguen and Burstall's notion of institutions for the formalization of logical systems, this work elaborates precisely which formal connections exist between big-stepped probabilities and standard probabilities, thereby establishing the exact relationships among these logics.

Lifted First-Order Belief Propagation

This paper proposes the first lifted version of a scalable probabilistic inference algorithm, belief propagation, based on first constructing a lifted network, where each node represents a set of ground atoms that all pass the same messages during belief propagation.

Characterizing the Principle of Minimum Cross-Entropy Within a Conditional-Logical Framework