Publications
Feature-Rich Part-of-Speech Tagging with a Cyclic Dependency Network
We present a new part-of-speech tagger that demonstrates the following ideas: (i) explicit use of both preceding and following tag contexts via a dependency network representation, (ii) broad use of …
  • Cited by 3,151 · 258 highly influential
Accurate Unlexicalized Parsing
We demonstrate that an unlexicalized PCFG can parse much more accurately than previously shown, by making use of simple, linguistically motivated state splits, which break down false independence …
  • Cited by 3,192 · 252 highly influential
Learning Accurate, Compact, and Interpretable Tree Annotation
We present an automatic approach to tree annotation in which basic nonterminal symbols are alternately split and merged to maximize the likelihood of a training treebank. Starting with a simple X-bar …
  • Cited by 925 · 153 highly influential
Corpus-Based Induction of Syntactic Structure: Models of Dependency and Constituency
We present a generative model for the unsupervised learning of dependency structures. We also describe the multiplicative combination of this dependency model with a model of linear constituency. …
  • Cited by 494 · 90 highly influential
Improved Inference for Unlexicalized Parsing
We present several improvements to unlexicalized parsing with hierarchically state-split PCFGs. First, we present a novel coarse-to-fine method in which a grammar’s own hierarchical projections are …
  • Cited by 685 · 69 highly influential
Alignment by Agreement
We present an unsupervised approach to symmetric word alignment in which two simple asymmetric models are trained jointly to maximize a combination of data likelihood and agreement between the …
  • Cited by 475 · 66 highly influential
Fast Exact Inference with a Factored Model for Natural Language Parsing
We present a novel generative model for natural language tree structures in which semantic (lexical dependency) and syntactic (PCFG) structures are scored with separate models. This factorization …
  • Cited by 855 · 61 highly influential
Neural Module Networks
Visual question answering is fundamentally compositional in nature: a question like "where is the dog?" shares substructure with questions like "what color is the dog?" and "where is the cat?" This paper …
  • Cited by 492 · 58 highly influential
Painless Unsupervised Learning with Features
We show how features can easily be added to standard generative models for unsupervised learning, without requiring complex new training methods. In particular, each component multinomial of a …
  • Cited by 223 · 53 highly influential
Learning Dependency-Based Compositional Semantics
Suppose we want to build a system that answers a natural language question by representing its semantics as a logical form and computing the answer given a structured database of facts. The core …
  • Cited by 505 · 51 highly influential