Publications
Lexical Semantics
  • Citations: 989 · Highly Influential: 105
Three New Probabilistic Models for Dependency Parsing: An Exploration
TLDR
This material is based upon work supported under a National Science Foundation Graduate Fellowship, and has benefited greatly from discussions with Mike Collins, Dan Melamed, Mitch Marcus and Adwait Ratnaparkhi.
  • Citations: 698 · Highly Influential: 72
The Neural Hawkes Process: A Neurally Self-Modulating Multivariate Point Process
TLDR
We model streams of discrete events in continuous time, by constructing a neurally self-modulating multivariate point process in which the intensities of multiple event types evolve according to a novel continuous-time LSTM.
  • Citations: 214 · Highly Influential: 35
Using "Annotator Rationales" to Improve Machine Learning for Text Categorization
We propose a new framework for supervised machine learning. Our goal is to learn from smaller amounts of supervised training data, by collecting a richer kind of training data: annotations with …
  • Citations: 199 · Highly Influential: 32
Contrastive Estimation: Training Log-Linear Models on Unlabeled Data
TLDR
This work was supported by a Fannie and John Hertz Foundation fellowship to the first author and NSF ITR grant IIS-0313193 to the second author.
  • Citations: 357 · Highly Influential: 28
The SIGMORPHON 2016 Shared Task - Morphological Reinflection
TLDR
The 2016 SIGMORPHON Shared Task was devoted to the problem of morphological reinflection.
  • Citations: 173 · Highly Influential: 26
Latent-Variable Modeling of String Transductions with Finite-State Methods
TLDR
We present a conditional log-linear model for string-to-string transduction, which employs overlapping features over latent alignment sequences, and which learns latent classes and latent string pair regions from incomplete training data.
  • Citations: 102 · Highly Influential: 16
Learning to Search in Branch and Bound Algorithms
TLDR
We address the key challenge of learning an adaptive node searching order for any class of problem solvable by branch-and-bound.
  • Citations: 99 · Highly Influential: 15
Efficient Generation in Primitive Optimality Theory
TLDR
This paper introduces primitive Optimality Theory (OTP), a linguistically motivated formalization of OT.
  • Citations: 121 · Highly Influential: 15
Learning Non-Isomorphic Tree Mappings for Machine Translation
TLDR
We reformulate TSG to permit dependency trees, and sketch EM/Viterbi algorithms for alignment, training, and decoding.
  • Citations: 281 · Highly Influential: 14