Publications
Learning Accurate, Compact, and Interpretable Tree Annotation
We present an automatic approach to tree annotation in which basic nonterminal symbols are alternately split and merged to maximize the likelihood of a training treebank.
Universal Dependencies v1: A Multilingual Treebank Collection
We present version 1 of the universal guidelines, the underlying design principles, and the currently available treebanks for 33 languages.
A Universal Part-of-Speech Tagset
To facilitate future research in unsupervised induction of syntactic structure and to standardize best practices, we propose a tagset that consists of twelve universal part-of-speech categories.
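For reference, the twelve categories of the published tagset can be written out as a small Python set (the "." tag covers punctuation and "X" covers everything else):

```python
# The twelve universal part-of-speech categories of the proposed tagset.
# "." is the punctuation tag; "X" is the catch-all category for foreign
# words, abbreviations, and other leftovers.
UNIVERSAL_TAGS = {
    "NOUN", "VERB", "ADJ", "ADV", "PRON", "DET",
    "ADP", "NUM", "CONJ", "PRT", ".", "X",
}

print(len(UNIVERSAL_TAGS))  # twelve categories
```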
Grammar as a Foreign Language
We show that a domain-agnostic, attention-enhanced sequence-to-sequence model achieves state-of-the-art results on the most widely used syntactic constituency parsing dataset when trained on a large synthetic corpus that was annotated using existing parsers.
Globally Normalized Transition-Based Neural Networks
We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing, and sentence compression results.
Natural Questions: A Benchmark for Question Answering Research
We present the Natural Questions corpus, a question answering dataset of real, anonymized queries issued to Google Search, paired with annotated answers from Wikipedia.
Improved Inference for Unlexicalized Parsing
We present several improvements to unlexicalized parsing with hierarchically split PCFGs.
Universal Dependency Annotation for Multilingual Parsing
We present a new collection of treebanks with homogeneous syntactic dependency annotation for six languages: German, English, Swedish, Spanish, French and Korean.
Multi-Source Transfer of Delexicalized Dependency Parsers
We present a simple method for transferring dependency parsers from source languages with labeled training data to target languages without labeled data.
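The core idea behind delexicalized transfer can be sketched in a few lines: strip the word forms and keep only the POS tags, so that a parser trained on the source-language treebank sees the same (shared) vocabulary when applied to the target language. The function and data representation below are illustrative stand-ins, not the paper's actual interface:

```python
def delexicalize(sentence):
    """Replace every word form with its POS tag.

    `sentence` is a list of (form, pos) pairs -- a hypothetical stand-in
    for a real treebank format such as CoNLL. After delexicalization, the
    parser's input vocabulary is just the shared POS tagset, so a model
    trained on English can be run on any language with the same tags.
    """
    return [(pos, pos) for _form, pos in sentence]

# Spanish input: only the shared POS tags survive delexicalization.
sent = [("El", "DET"), ("perro", "NOUN"), ("ladra", "VERB")]
print(delexicalize(sent))  # [('DET', 'DET'), ('NOUN', 'NOUN'), ('VERB', 'VERB')]
```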
Structured Training for Neural Network Transition-Based Parsing
In this work, we combine the representational power of neural networks with the superior search enabled by structured training and inference, making our parser one of the most accurate dependency parsers to date.