Inducing Semantic Roles Without Syntax

@inproceedings{Michael2021InducingSR,
  title={Inducing Semantic Roles Without Syntax},
  author={Julian Michael and Luke Zettlemoyer},
  booktitle={Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021},
  year={2021}
}
Semantic roles are a key component of linguistic predicate-argument structure, but developing ontologies of these roles requires significant expertise and manual effort. Methods exist for automatically inducing semantic roles using syntactic representations, but syntax can also be difficult to define, annotate, and predict. We show it is possible to automatically induce semantic roles from QA-SRL, a scalable and ontology-free semantic annotation scheme that uses question-answer pairs to… 
2 Citations

Asking It All: Generating Contextualized Questions for any Semantic Role

TLDR
The task of role question generation is introduced, which requires producing a set of questions asking about all possible semantic roles of the predicate, and a two-stage model is developed, which first produces a context-independent question prototype for each role and then revises it to be contextually appropriate for the passage.

Unsupervised Slot Schema Induction for Task-oriented Dialog

TLDR
This work proposes an unsupervised approach for slot schema induction from unlabeled dialog corpora: leveraging in-domain language models and unsupervised parsing structures, it extracts candidate slots without constraints, followed by coarse-to-fine clustering to induce slot types.

References

Showing 1–10 of 53 references

The Proposition Bank: An Annotated Corpus of Semantic Roles

TLDR
An automatic system for semantic role tagging trained on the corpus is described and the effect on its performance of various types of information is discussed, including a comparison of full syntactic parsing with a flat representation and the contribution of the empty trace categories of the treebank.

Similarity-Driven Semantic Role Induction via Graph Partitioning

TLDR
The working hypothesis of this article is that semantic roles can be induced without human supervision from a corpus of syntactically parsed sentences based on three linguistic principles, and a method is presented that implements these principles and formalizes the task as a graph partitioning problem.

A Bayesian Approach to Unsupervised Semantic Role Induction

TLDR
Two Bayesian models for the unsupervised semantic role labeling (SRL) task are introduced, with the coupled model consistently outperforming the factored counterpart in all experimental set-ups.

Question-Answer Driven Semantic Role Labeling: Using Natural Language to Annotate Natural Language

TLDR
The results show that non-expert annotators can produce high-quality QA-SRL data, establish baseline performance levels for future work on this task, and introduce simple classifier-based models for predicting which questions to ask and what their answers should be.

Unsupervised Induction of Semantic Roles

TLDR
A method for inducing the semantic roles of verbal arguments directly from unannotated text by detecting alternations and finding a canonical syntactic form for them in a novel probabilistic model, a latent-variable variant of the logistic classifier.

QANom: Question-Answer driven SRL for Nominalizations

We propose a new semantic scheme for capturing predicate-argument relations for nominalizations, termed QANom. This scheme extends the QA-SRL formalism (He et al., 2015), modeling the relations

The CoNLL 2008 Shared Task on Joint Parsing of Syntactic and Semantic Dependencies

TLDR
This shared task not only unifies the shared tasks of the previous four years under a unique dependency-based formalism, but also extends them significantly: this year's syntactic dependencies include more information such as named-entity boundaries; the semantic dependencies model roles of both verbal and nominal predicates.

Unsupervised Induction of Semantic Roles within a Reconstruction-Error Minimization Framework

TLDR
This work introduces a new approach to unsupervised estimation of feature-rich semantic role labeling models that performs on par with the most accurate role induction methods on English and German, even though it does not incorporate any prior linguistic knowledge about the languages.

Universal Conceptual Cognitive Annotation (UCCA)

TLDR
UCCA is presented, a novel multi-layered framework for semantic representation that aims to accommodate the semantic distinctions expressed through linguistic utterances and its relative insensitivity to meaning-preserving syntactic variation is demonstrated.

Multiplicative Representations for Unsupervised Semantic Role Induction

TLDR
This work proposes a neural model to learn argument embeddings from the context by explicitly incorporating dependency relations as multiplicative factors, which bias argument embeddings according to their dependency roles.
...