Comparison by Conversion: Reverse-Engineering UCCA from Syntax and Lexical Semantics

@inproceedings{Hershcovich2020ComparisonBC,
  title={Comparison by Conversion: Reverse-Engineering UCCA from Syntax and Lexical Semantics},
  author={Daniel Hershcovich and Nathan Schneider and Dotan Dvir and Jakob Prange and Miryam de Lhoneux and Omri Abend},
  booktitle={COLING},
  year={2020}
}
Building robust natural language understanding systems will require a clear characterization of whether and how various linguistic meaning representations complement each other. To perform a systematic comparative analysis, we evaluate the mapping between meaning representations from different frameworks using two complementary methods: (i) a rule-based converter, and (ii) a supervised delexicalized parser that parses to one framework using only information from the other as features. We apply… 

Cross-lingual Semantic Representation for NLP with UCCA
This is an introductory tutorial to UCCA (Universal Conceptual Cognitive Annotation), a cross-linguistically applicable framework for semantic representation, with corpora annotated in English, …
Oracle Linguistic Graphs Complement a Pretrained Transformer Language Model: A Cross-formalism Comparison
It is found that, overall, semantic constituency structures are most useful to language modeling performance—outpacing syntactic constituency structures as well as syntactic and semantic dependency structures.
It's the Meaning That Counts: The State of the Art in NLP and Semantics
This work reviews the state of computational semantics in NLP and investigates how different lines of inquiry reflect distinct understandings of semantics and prioritize different layers of linguistic meaning.
Linguistic Frameworks Go Toe-to-Toe at Neuro-Symbolic Language Modeling
We examine the extent to which, in principle, different syntactic and semantic graph representations can complement and improve neural language modeling. Specifically, by conditioning on a subgraph …

References

Showing 1–10 of 56 references
Conceptual Annotations Preserve Structure Across Translations: A French-English Case Study
It is shown that UCCA can be used to annotate French, through a systematic type-level analysis of the major French grammatical phenomena, and results show a high degree of stability across translations, supporting the usage of semantic annotations over syntactic ones in structure-aware MT systems.
Universal Conceptual Cognitive Annotation (UCCA)
UCCA is presented, a novel multi-layered framework for semantic representation that aims to accommodate the semantic distinctions expressed through linguistic utterances and its relative insensitivity to meaning-preserving syntactic variation is demonstrated.
Cross-lingual Decompositional Semantic Parsing
A form of decompositional semantic analysis designed to allow systems to target varying levels of structural complexity (shallow to deep analysis), an evaluation metric to measure the similarity between system output and reference semantic analysis, and an end-to-end model with a novel annotating mechanism that supports intra-sentential coreference are presented.
The Parallel Meaning Bank: Towards a Multilingual Corpus of Translations Annotated with Compositional Meaning Representations
The approach is based on cross-lingual projection: automatically produced (and manually corrected) semantic annotations for English sentences are mapped onto their word-aligned translations, assuming that the translations are meaning-preserving.
HUME: Human UCCA-Based Evaluation of Machine Translation
A semantics-based evaluation of machine translation quality is argued for, which captures what meaning components are retained in the MT output, thus providing a more fine-grained analysis of translation quality and enabling the construction and tuning of semantics-based MT.
Content Differences in Syntactic and Semantic Representation
The substantial differences between the schemes suggest that semantic parsers are likely to benefit downstream text understanding applications beyond their syntactic counterparts, highlighting both challenges and potential sources for improvement.
Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection
Universal Dependencies is an open community effort to create cross-linguistically consistent treebank annotation for many languages within a dependency-based lexicalist framework. The annotation …
A Transition-Based Directed Acyclic Graph Parser for UCCA
This work presents the first parser for UCCA, a cross-linguistically applicable framework for semantic representation that builds on extensive typological work and supports rapid annotation. The parser's ability to handle more general graph structures can inform the development of parsers for other semantic DAG structures, and for languages that frequently use discontinuous structures.
The Proposition Bank: An Annotated Corpus of Semantic Roles
An automatic system for semantic role tagging trained on the corpus is described and the effect on its performance of various types of information is discussed, including a comparison of full syntactic parsing with a flat representation and the contribution of the empty trace categories of the treebank.
HIT-SCIR at MRP 2019: A Unified Pipeline for Meaning Representation Parsing via Efficient Training and Effective Encoding
This paper proposes a unified pipeline for meaning representation parsing, including framework-specific transition-based parsers, BERT-enhanced word representations, and post-processing; the system ranked first in the UCCA framework.