Semantic Expressive Capacity with Bounded Memory

@inproceedings{Venant2019SemanticEC,
  title={Semantic Expressive Capacity with Bounded Memory},
  author={Antoine Venant and Alexander Koller},
  booktitle={ACL},
  year={2019}
}
We investigate the capacity of mechanisms for compositional semantic parsing to describe relations between sentences and semantic representations. We prove that in order to represent certain relations, mechanisms which are syntactically projective must be able to remember an unbounded number of locations in the semantic representations, whereas nonprojective mechanisms need not. This is the first result of its kind, and it has consequences both for grammar-based and for neural systems.
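As background for the projective/nonprojective distinction (this is standard dependency-parsing terminology, not the paper's own formalism): a dependency tree over a sentence is projective when no two arcs cross. A minimal sketch of this check in Python, using a hypothetical `is_projective` helper over a head-index encoding:

```python
def is_projective(heads):
    """Check whether a dependency tree is projective.

    heads[i] is the index of the head of token i, or -1 for the root.
    Each arc (h, d) spans the interval (min(h, d), max(h, d)); under the
    usual simplified definition, the tree is projective iff no two arcs
    cross, i.e. no arc starts strictly inside another arc's span and
    ends strictly outside it.
    """
    arcs = [(min(h, d), max(h, d)) for d, h in enumerate(heads) if h >= 0]
    for i, (l1, r1) in enumerate(arcs):
        for l2, r2 in arcs[i + 1:]:
            if l1 < l2 < r1 < r2 or l2 < l1 < r2 < r1:
                return False
    return True


# A chain-like tree with nested arcs is projective:
print(is_projective([-1, 0, 1, 0]))  # arcs (0,1), (1,2), (0,3) nest

# Arcs (1,3) and (0,2) cross, so this tree is nonprojective:
print(is_projective([-1, 3, 0, 0]))
```

Nonprojective structures of this kind are exactly the ones a strictly projective parser cannot derive directly, which is where the paper's memory lower bound for projective mechanisms applies.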
