Corpus ID: 14585127

Efficient Markov Logic Inference for Natural Language Semantics

@inproceedings{Beltagy2014EfficientML,
  title={Efficient Markov Logic Inference for Natural Language Semantics},
  author={Iz Beltagy and Raymond J. Mooney},
  booktitle={StarAI@AAAI},
  year={2014}
}
Using Markov logic to integrate logical and distributional information in natural-language semantics results in complex inference problems involving long, complicated formulae. Current inference methods for Markov logic are ineffective on such problems. To address this, we propose a new inference algorithm based on SampleSearch that computes probabilities of complete formulae rather than ground atoms. We also introduce a modified closed-world assumption that significantly reduces the…
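
To make the setting concrete, here is a minimal sketch of what computing the probability of a complete formula means; it is a brute-force enumeration over the possible worlds of a toy ground MLN (atoms, clauses, and weights all invented for illustration), not the paper's SampleSearch-based algorithm.

```python
# Toy illustration of formula-level inference in a ground Markov Logic
# Network: P(query) = weight of worlds satisfying the query / partition Z.
# Brute force; real systems cannot enumerate worlds like this.
import itertools, math

atoms = ["dog(d)", "animal(d)", "chase(d,c)"]

# Weighted ground clauses: (weight, function world -> bool).
clauses = [
    (2.0, lambda w: (not w["dog(d)"]) or w["animal(d)"]),  # dog -> animal
    (0.5, lambda w: w["dog(d)"]),                          # prior on dog(d)
]

def weight(world):
    # exp(sum of weights of the ground clauses satisfied in this world)
    return math.exp(sum(wt for wt, c in clauses if c(world)))

def prob(query):
    z = num = 0.0
    for bits in itertools.product([False, True], repeat=len(atoms)):
        world = dict(zip(atoms, bits))
        w = weight(world)
        z += w
        if query(world):
            num += w
    return num / z

# The query is a complete formula, not a single ground atom:
query = lambda w: w["dog(d)"] and w["animal(d)"]
print(prob(query))
```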

Citations

Natural Language Semantics using Probabilistic Logic

This work proposes using probabilistic logic to represent natural language semantics, combining the expressivity and automated inference of logic with the gradedness of distributional representations.

UTexas: Natural Language Semantics using Distributional Semantics and Probabilistic Logic

This work represents natural language semantics by combining logical and distributional information in probabilistic logic, using Markov Logic Networks for the RTE task and Probabilistic Soft Logic for the STS task.

Representing Meaning with a Combination of Logical and Distributional Models

This article adopts a hybrid approach that combines logical and distributional semantics using probabilistic logic, specifically Markov Logic Networks, and releases a lexical entailment data set of 10,213 rules extracted from the SICK data set, a valuable resource for evaluating lexical entailment systems.

Markov Logic Networks for Natural Language Question Answering

The experiments, demonstrating a 15% accuracy boost and a 10x reduction in runtime, suggest that the flexibility and different inference semantics of Praline are a better fit for the natural language question answering task.

On the Proper Treatment of Quantifiers in Probabilistic Logic Semantics

This paper shows how to formulate RTE inference problems in probabilistic logic in a way that takes the domain closure and closed-world assumptions into account, and achieves 100% accuracy on the synthetic dataset and on the relevant part of FraCaS.
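
As a quick illustration of the domain closure assumption this paper relies on: with a finite, known set of constants, quantified formulae reduce to finite disjunctions and conjunctions. A sketch with invented constants and predicate names:

```python
# Sketch of the domain closure assumption: with a finite, known set of
# constants, quantifiers ground out to finite boolean combinations.
domain = ["a", "b", "c"]

def ground_exists(pred):
    "exists x. pred(x)  ->  pred(a) | pred(b) | pred(c)"
    return " | ".join(f"{pred}({x})" for x in domain)

def ground_forall(pred):
    "forall x. pred(x)  ->  pred(a) & pred(b) & pred(c)"
    return " & ".join(f"{pred}({x})" for x in domain)

print(ground_exists("dog"))     # dog(a) | dog(b) | dog(c)
print(ground_forall("animal"))  # animal(a) & animal(b) & animal(c)
```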

Representing Meaning with a Combination of Logical Form and Vectors

A hybrid approach that combines logic-based and distributional semantics through probabilistic logic inference in Markov Logic Networks (MLNs) is adopted, and a state-of-the-art result is achieved on the SICK dataset.

Exploring Markov Logic Networks for Question Answering

A system that reasons with knowledge derived from textbooks, represented in a subset of first-order logic, called Praline, which demonstrates a 15% accuracy boost and a 10x reduction in runtime as compared to other MLN-based methods, and comparable accuracy to word-based baseline approaches.

Tractable Probabilistic Reasoning Through Effective Grounding

This position paper will draw attention to open research areas around efficiently instantiating templated probabilistic models.
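
To make concrete what instantiating a templated probabilistic model means, here is a naive grounding sketch (the classic smokers rule, with invented constants); the position paper concerns avoiding exactly this kind of blow-up.

```python
# Naive grounding of a templated rule over a set of constants. Real
# systems avoid materializing all groundings; identifiers illustrative.
from itertools import product

constants = ["anna", "bob", "carol"]
# Template: Friends(x, y) & Smokes(x) -> Smokes(y), weight 1.5
groundings = [
    (1.5, f"Friends({x},{y}) & Smokes({x}) -> Smokes({y})")
    for x, y in product(constants, repeat=2) if x != y
]
print(len(groundings))  # grows as O(|constants|^2) for a two-variable rule
```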

Reasoning about Unmodelled Concepts - Incorporating Class Taxonomies in Probabilistic Relational Models

This work proposes fuzzy inference in Markov logic networks, which enables the use of taxonomic knowledge to impose structure on possible worlds, and shows that by exploiting this structure, probability distributions can be represented more compactly and the reasoning system becomes capable of reasoning about concepts not contained in the probabilistic knowledge base.
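
One way to picture the idea, as a loose sketch rather than the paper's formalism: derive a graded membership score for an unmodelled concept from its position in a class taxonomy, and feed that score in as a soft truth value. The taxonomy, decay parameter, and scoring rule below are all assumptions for illustration.

```python
# Illustrative only: graded is-a scores from a toy class taxonomy,
# usable as soft/fuzzy truth values for concepts absent from the KB.
taxonomy = {"poodle": "dog", "dog": "mammal", "mammal": "animal"}

def depth_to(concept, ancestor):
    d = 0
    while concept is not None:
        if concept == ancestor:
            return d
        concept = taxonomy.get(concept)
        d += 1
    return None  # ancestor not reachable

def fuzzy_is_a(concept, ancestor, decay=0.8):
    d = depth_to(concept, ancestor)
    return 0.0 if d is None else decay ** d

print(fuzzy_is_a("poodle", "animal"))  # 0.8**3 = 0.512
```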

Interpretation of Natural-language Robot Instructions: Probabilistic Knowledge Representation, Learning, and Reasoning

This work presents PRAC – Probabilistic Action Cores – an interpreter for natural-language instructions which is able to resolve vagueness and ambiguity in natural language and infer missing information pieces that are required to render an instruction executable by a robot.

References

Showing 1-10 of 21 references.

Sound and Efficient Inference with Probabilistic and Deterministic Dependencies

MC-SAT is an inference algorithm that combines ideas from MCMC and satisfiability and is based on Markov logic, which defines Markov networks using weighted clauses in first-order logic; it greatly outperforms Gibbs sampling and simulated tempering over a broad range of problem sizes and degrees of determinism.
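
A simplified sketch of the MC-SAT step on a tiny propositional network: each iteration keeps a random subset of the currently satisfied clauses (a clause with weight w is kept with probability 1 - e^-w) and then samples uniformly from the worlds satisfying that subset. Real MC-SAT uses SampleSAT for the uniform step; the brute-force version below is only for illustration.

```python
# Simplified MC-SAT sketch; uniform sampling over solutions is done by
# brute-force enumeration here, not by SampleSAT as in the real algorithm.
import math, random, itertools

atoms = ["p", "q"]
clauses = [(1.0, lambda w: w["p"] or w["q"]),
           (2.0, lambda w: (not w["p"]) or w["q"])]

def all_worlds():
    for bits in itertools.product([False, True], repeat=len(atoms)):
        yield dict(zip(atoms, bits))

def mc_sat_step(state):
    # Keep each clause satisfied by `state` with probability 1 - exp(-w).
    m = [c for wt, c in clauses
         if c(state) and random.random() < 1 - math.exp(-wt)]
    # Sample uniformly from the worlds satisfying every kept clause.
    sols = [w for w in all_worlds() if all(c(w) for c in m)]
    return random.choice(sols)

state = {"p": False, "q": False}
hits = 0
for _ in range(5000):
    state = mc_sat_step(state)
    hits += state["q"]
print(hits / 5000)  # estimate of P(q)
```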

Composition in Distributional Models of Semantics

This article proposes a framework for representing the meaning of word combinations in vector space in terms of additive and multiplicative functions, and introduces a wide range of composition models that are evaluated empirically on a phrase similarity task.
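
The two simplest composition functions in this framework can be stated directly; a sketch with invented 3-dimensional word vectors:

```python
# Additive and multiplicative composition of word vectors (toy values).
import numpy as np

word_vec = {
    "red": np.array([0.8, 0.1, 0.3]),
    "car": np.array([0.2, 0.9, 0.4]),
}

additive = word_vec["red"] + word_vec["car"]        # p = u + v
multiplicative = word_vec["red"] * word_vec["car"]  # p = u * v (elementwise)
print(additive, multiplicative)
```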

Lifted First-Order Belief Propagation

This paper proposes the first lifted version of a scalable probabilistic inference algorithm, belief propagation, based on first constructing a lifted network, where each node represents a set of ground atoms that all pass the same messages during belief propagation.
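
A rough sketch of the lifted-network idea: ground atoms that would provably send and receive the same messages are merged into supernodes, and belief propagation runs once per supernode. The grouping rule below (predicate name plus evidence status) is a simplification of the paper's construction, with invented atoms.

```python
# Group ground atoms into "supernodes" by a coarse signature; BP would
# then pass messages per group rather than per ground atom.
from collections import defaultdict

evidence = {"smokes(anna)": True}
ground_atoms = ["smokes(anna)", "smokes(bob)", "smokes(carol)",
                "cancer(bob)", "cancer(carol)"]

supernodes = defaultdict(list)
for atom in ground_atoms:
    pred = atom.split("(")[0]
    sig = (pred, evidence.get(atom))  # (predicate, evidence value or None)
    supernodes[sig].append(atom)

for sig, members in supernodes.items():
    print(sig, members)  # one BP node per group, not per ground atom
```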

Probabilistic theorem proving

This work proposes the first method that has the full power of both graphical model inference and first-order theorem proving (in finite domains with Herbrand interpretations), along with an algorithm for approximate PTP that is shown to be superior to lifted belief propagation.
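
At the propositional level, the quantity PTP computes reduces to weighted model counting; a brute-force sketch over a two-atom theory (weights and clause invented) shows the reduction, which PTP then lifts to first order.

```python
# Brute-force weighted model counting (WMC) over a tiny theory.
import itertools

atoms = ["a", "b"]
atom_weight = {"a": 0.3, "b": 0.6}        # P(atom is true), independent priors
theory = lambda w: (not w["a"]) or w["b"]  # clause: a -> b

def wmc(constraint):
    total = 0.0
    for bits in itertools.product([False, True], repeat=len(atoms)):
        w = dict(zip(atoms, bits))
        if constraint(w):
            pr = 1.0
            for x in atoms:
                pr *= atom_weight[x] if w[x] else 1 - atom_weight[x]
            total += pr
    return total

# P(b | theory) = WMC(theory & b) / WMC(theory)
print(wmc(lambda w: theory(w) and w["b"]) / wmc(theory))
```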

Towards a Formal Distributional Semantics: Simulating Logical Calculi with Tensors

This paper discusses how the canonical isomorphism between tensors and multilinear maps can be exploited to simulate a full-blown quantifier-free predicate calculus using tensors.
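
The core construction can be reproduced in a few lines: truth values become basis vectors, negation a permutation matrix, and conjunction a third-order tensor applied as a bilinear map. A sketch following that standard encoding:

```python
# Boolean values as vectors, logical operators as (multi)linear maps.
import numpy as np

T, F = np.array([1, 0]), np.array([0, 1])
NOT = np.array([[0, 1],
                [1, 0]])          # swaps T and F
# AND[i, j, k] = 1 iff (truth value j) AND (truth value k) = truth value i
AND = np.zeros((2, 2, 2))
AND[0, 0, 0] = 1                                 # T and T = T
AND[1, 0, 1] = AND[1, 1, 0] = AND[1, 1, 1] = 1   # anything with F = F

def apply_and(u, v):
    return np.einsum("ijk,j,k->i", AND, u, v)

print(NOT @ T)          # [0 1] == F
print(apply_and(T, F))  # [0 1] == F
print(apply_and(T, T))  # [1 0] == T
```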

Wide-Coverage Semantic Analysis with Boxer

Boxer is an open-domain software component for semantic analysis of text, based on Combinatory Categorial Grammar and Discourse Representation Theory; evaluation shows that bridging references and pronouns are not resolved in most cases.

A SICK cure for the evaluation of compositional distributional semantic models

This work aims to help the research community working on compositional distributional semantic models (CDSMs) by providing SICK (Sentences Involving Compositional Knowledge), a large English benchmark tailored for them.

Towards Efficient Sampling: Exploiting Random Walk Strategies

It is shown that random-walk SAT procedures often do reach the full set of solutions of complex logical theories, and it is analyzed how the sampling becomes near-uniform.
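
A minimal illustration of the question studied here: run a pure random-walk SAT procedure (a simplification of WalkSAT, which also mixes in greedy flips) from many restarts on a small CNF formula, invented for this sketch, and record which distinct solutions are reached and how often.

```python
# Pure random-walk SAT with restarts; counts visits to distinct solutions.
import random

# CNF over variables 0..2: (x0 | x1) & (~x0 | x2)
clauses = [[(0, True), (1, True)], [(0, False), (2, True)]]

def satisfied(clause, a):
    return any(a[v] == pos for v, pos in clause)

def random_walk_sat(n_vars=3, max_flips=100):
    a = [random.random() < 0.5 for _ in range(n_vars)]
    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c, a)]
        if not unsat:
            return tuple(a)
        v, _ = random.choice(random.choice(unsat))  # flip a var from an unsat clause
        a[v] = not a[v]
    return None

hits = {}
for _ in range(2000):
    s = random_walk_sat()
    if s:
        hits[s] = hits.get(s, 0) + 1
print(hits)  # how often each distinct solution was reached
```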

Markov Logic: An Interface Layer for Artificial Intelligence

Most subfields of computer science have an interface layer via which applications communicate with the infrastructure, and this is key to their success, but this interface layer has been missing in AI.

From Frequency to Meaning: Vector Space Models of Semantics

The goal in this survey is to show the breadth of applications of VSMs for semantics, to provide a new perspective on VSMs, and to provide pointers into the literature for those who are less familiar with the field.
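
A minimal example of the simplest kind of VSM the survey covers, a term-document count matrix with cosine similarity (toy documents invented):

```python
# Term-document count matrix; cosine similarity between term vectors.
import numpy as np

docs = ["the dog chased the cat",
        "the cat sat on the mat",
        "stock markets fell sharply"]
vocab = sorted({w for d in docs for w in d.split()})
m = np.array([[d.split().count(w) for d in docs] for w in vocab])

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

i, j, k = vocab.index("dog"), vocab.index("cat"), vocab.index("markets")
print(cosine(m[i], m[j]))  # dog vs cat: co-occur in doc 0
print(cosine(m[i], m[k]))  # dog vs markets: 0.0
```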