Controlled Crowdsourcing for High-Quality QA-SRL Annotation

@inproceedings{Roit2020ControlledCF,
  title={Controlled Crowdsourcing for High-Quality QA-SRL Annotation},
  author={Paul Roit and Ayal Klein and Daniela Stepanov and Jonathan Mamou and Julian Michael and Gabriel Stanovsky and Luke Zettlemoyer and Ido Dagan},
  booktitle={ACL},
  year={2020}
}

Question-answer driven Semantic Role Labeling (QA-SRL) was proposed as an attractive open and natural flavour of SRL, potentially attainable from laymen. Recently, a large-scale crowdsourced QA-SRL corpus and a trained parser were released. Trying to replicate the QA-SRL annotation for new texts, we found that the resulting annotations were lacking in quality, particularly in coverage, making them insufficient for further research and evaluation. In this paper, we present an improved…
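Concretely, QA-SRL labels each verbal predicate with natural-language question-answer pairs whose answers are spans of the sentence, so non-expert annotators can mark predicate-argument structure without learning a formal role inventory. Below is a minimal illustrative sketch of such an annotation; the dataclass names, the example sentence, and the exact question wording are our own illustration under that assumption, not the schema of the released corpus.

```python
from dataclasses import dataclass

@dataclass
class QAPair:
    """One argument of a predicate, captured as a natural-language
    question plus the answer span(s) selected from the sentence."""
    question: str
    answers: list[str]

@dataclass
class PredicateAnnotation:
    """All QA pairs collected for a single verbal predicate."""
    sentence: str
    predicate: str
    qa_pairs: list[QAPair]

# Illustrative annotation (sentence and QA pairs are our own example):
example = PredicateAnnotation(
    sentence="John gave Mary a book yesterday.",
    predicate="gave",
    qa_pairs=[
        QAPair("Who gave something?", ["John"]),
        QAPair("What did someone give?", ["a book"]),
        QAPair("Who was given something?", ["Mary"]),
        QAPair("When did someone give something?", ["yesterday"]),
    ],
)

# Coverage, the quality axis the abstract highlights, can be gauged by
# how many of a predicate's arguments are recovered as answer spans.
for qa in example.qa_pairs:
    print(f"{qa.question:40s} -> {qa.answers}")
```

Representing roles as templated questions ("Who…?", "What…?", "When…?") is what makes the scheme open: workers only need to ask and answer questions about the sentence, rather than choose among predefined role labels.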