Inducing Semantic Roles Without Syntax

@inproceedings{Michael2021InducingSR,
  title={Inducing Semantic Roles Without Syntax},
  author={Julian Michael and Luke Zettlemoyer},
  booktitle={Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021},
  year={2021}
}
Semantic roles are a key component of linguistic predicate-argument structure, but developing ontologies of these roles requires significant expertise and manual effort. Methods exist for automatically inducing semantic roles using syntactic representations, but syntax can also be difficult to define, annotate, and predict. We show it is possible to automatically induce semantic roles from QA-SRL, a scalable and ontology-free semantic annotation scheme that uses question-answer pairs to …
1 Citation
Asking It All: Generating Contextualized Questions for any Semantic Role
Asking questions about a situation is an inherent step towards understanding it. To this end, we introduce the task of role question generation, which, given a predicate mention and a passage, …

References

Showing 1–10 of 53 references
The Proposition Bank: An Annotated Corpus of Semantic Roles
An automatic system for semantic role tagging trained on the corpus is described, and the effect on its performance of various types of information is discussed, including a comparison of full syntactic parsing with a flat representation and the contribution of the empty trace categories of the treebank.
Similarity-Driven Semantic Role Induction via Graph Partitioning
The working hypothesis of this article is that semantic roles can be induced without human supervision from a corpus of syntactically parsed sentences based on three linguistic principles; a method is presented that implements these principles and formalizes the task as a graph partitioning problem.
A Bayesian Approach to Unsupervised Semantic Role Induction
Two Bayesian models for the unsupervised semantic role labeling (SRL) task are introduced, with the coupled model consistently outperforming the factored counterpart in all experimental set-ups.
Question-Answer Driven Semantic Role Labeling: Using Natural Language to Annotate Natural Language
The results show that non-expert annotators can produce high-quality QA-SRL data and establish baseline performance levels for future work on this task; simple classifier-based models are introduced for predicting which questions to ask and what their answers should be.
Unsupervised Induction of Semantic Roles
A method for inducing the semantic roles of verbal arguments directly from unannotated text by detecting alternations and finding a canonical syntactic form for them in a novel probabilistic model, a latent-variable variant of the logistic classifier.
QANom: Question-Answer driven SRL for Nominalizations
We propose a new semantic scheme for capturing predicate-argument relations for nominalizations, termed QANom. This scheme extends the QA-SRL formalism (He et al., 2015), modeling the relations …
The CoNLL 2008 Shared Task on Joint Parsing of Syntactic and Semantic Dependencies
This shared task not only unifies the shared tasks of the previous four years under a unique dependency-based formalism, but also extends them significantly: this year's syntactic dependencies include more information such as named-entity boundaries, and the semantic dependencies model roles of both verbal and nominal predicates.
Unsupervised Induction of Semantic Roles within a Reconstruction-Error Minimization Framework
This work introduces a new approach to unsupervised estimation of feature-rich semantic role labeling models that performs on par with the most accurate role induction methods on English and German, even though it does not incorporate any prior linguistic knowledge about the languages.
Universal Conceptual Cognitive Annotation (UCCA)
UCCA is presented, a novel multi-layered framework for semantic representation that aims to accommodate the semantic distinctions expressed through linguistic utterances, and its relative insensitivity to meaning-preserving syntactic variation is demonstrated.
Multiplicative Representations for Unsupervised Semantic Role Induction
This work proposes a neural model to learn argument embeddings from the context by explicitly incorporating dependency relations as multiplicative factors, which bias argument embeddings according to their dependency roles.