Combining Event Semantics and Degree Semantics for Natural Language Inference

@inproceedings{Haruta2020CombiningES,
  title={Combining Event Semantics and Degree Semantics for Natural Language Inference},
  author={Izumi Haruta and Koji Mineshima and Daisuke Bekki},
  booktitle={International Conference on Computational Linguistics},
  year={2020}
}
In formal semantics, there are two well-developed semantic frameworks: event semantics, which treats verbs and adverbial modifiers using the notion of event, and degree semantics, which analyzes adjectives and comparatives using the notion of degree. However, it is not obvious whether these frameworks can be combined to handle cases in which the phenomena in question are interacting with each other. Here, we study this issue by focusing on natural language inference (NLI). We implement a logic… 
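As an illustrative sketch of what combining the two frameworks involves (a standard textbook-style representation, not necessarily the paper's exact formalization), a comparative sentence such as "Ann ran faster than Bob" can be given a logical form that quantifies over events while comparing degrees associated with those events:

```latex
\exists e_1 \exists e_2 \,\bigl(
  \mathrm{run}(e_1) \wedge \mathrm{agent}(e_1, \mathrm{ann}) \wedge
  \mathrm{run}(e_2) \wedge \mathrm{agent}(e_2, \mathrm{bob}) \wedge
  \mathrm{speed}(e_1) > \mathrm{speed}(e_2)
\bigr)
```

Here the existential event variables come from event semantics, while the degree function $\mathrm{speed}$ and the ordering $>$ come from degree semantics; the interaction cases the paper targets are those where both components are needed in a single inference.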

Tables from this paper

Logical Inference for Counting on Semi-structured Tables

This work proposes a logical inference system for reasoning between semi-structured tables and texts and shows that this system can more robustly perform inference between tables and texts that requires numerical understanding compared with current neural approaches.

NeuralLog: Natural Language Inference with Joint Neural and Logical Reasoning

This work proposes an inference framework called NeuralLog, which utilizes both a monotonicity-based logical inference engine and a neural network language model for phrase alignment, and shows that the joint logic and neural inference system improves accuracy on the NLI task and can achieve state-of-the-art accuracy on the SICK and MED datasets.

A (Mostly) Symbolic System for Monotonic Inference with Unscoped Episodic Logical Forms

Empirical evidence is given for prior claims that ULF is an appropriate representation to mediate natural-logic-like inferences, and its capacity to handle a variety of challenging semantic phenomena is demonstrated using the FraCaS dataset.

Monotonicity Marking from Universal Dependency Trees

This paper presents a system that automatically annotates monotonicity information based on Universal Dependency parse trees, which utilizes surface-level monotonicity facts about quantifiers, lexical items, and token-level polarity information.
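To illustrate the kind of annotation involved (a standard example from the natural-logic literature, not drawn from that paper), the quantifier "every" is downward monotone in its restrictor and upward monotone in its scope, which licenses inferences by substituting more specific or more general terms at the marked positions:

```latex
\text{Every}^{\uparrow}\ \text{dog}^{\downarrow}\ \text{runs}^{\uparrow}
\quad\Longrightarrow\quad
\text{Every poodle runs} \ \ (\text{restrictor: } \downarrow)
\quad\text{and}\quad
\text{Every dog moves} \ \ (\text{scope: } \uparrow)
```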

References

Showing 1-10 of 36 references

Logical Inferences with Comparatives and Generalized Quantifiers

This paper presents a compositional semantics that maps various comparative constructions in English to semantic representations via Combinatory Categorial Grammar parsers and combines it with an inference system based on automated theorem proving that outperforms previous logic-based systems as well as recent deep learning-based models.

A SICK cure for the evaluation of compositional distributional semantic models

This work aims to help the research community working on compositional distributional semantic models (CDSMs) by providing SICK (Sentences Involving Compositional Knowledge), a large English benchmark tailored for them.

Right for the Wrong Reasons: Diagnosing Syntactic Heuristics in Natural Language Inference

There is substantial room for improvement in NLI systems, and the HANS dataset, which contains many examples where these heuristics fail, can motivate and measure progress in this area.

Events in the Semantics of English: A Study in Subatomic Semantics

Focusing on the structure of meaning in English sentences at a "subatomic" level - that is, a level below the one most theories accept as basic or "atomic" - Parsons asserts that the semantics of simple English sentences require logical forms somewhat more complex than is normally assumed in natural language semantics.

On-demand Injection of Lexical Knowledge for Recognising Textual Entailment

This work approaches the recognition of textual entailment using logical semantic representations and a theorem prover, producing a system that outperforms other logic-based systems and is competitive with state-of-the-art statistical methods.

Natural Solution to FraCaS Entailment Problems

This work employs the tableau theorem prover for natural language to solve the FraCaS problems in a natural way and demonstrates state-of-the-art competence over certain sections of FraCaS.

AllenNLP: A Deep Semantic Natural Language Processing Platform

AllenNLP is described, a library for applying deep learning methods to NLP research that addresses issues with easy-to-use command-line tools, declarative configuration-driven experiments, and modular NLP abstractions.

VerbOcean: Mining the Web for Fine-Grained Semantic Verb Relations

A semi-automatic method for extracting fine-grained semantic relations between verbs using lexicosyntactic patterns over the Web, which detects similarity, strength, antonymy, enablement, and temporal happens-before relations between pairs of strongly associated verbs.

Higher-order logical inference with compositional semantics

The results show that a system based on a reasonably-sized semantic lexicon and a manageable number of non-first-order axioms enables efficient logical inferences, including those concerned with generalized quantifiers and intensional operators, and outperforms the state-of-the-art first-order inference system.

Wide-Coverage Semantic Analysis with Boxer

Boxer is an open-domain software component for semantic analysis of text, based on Combinatory Categorial Grammar and Discourse Representation Theory; an evaluation shows that bridging references and pronouns are not resolved in most cases.