DyNet: The Dynamic Neural Network Toolkit
TLDR: We describe DyNet, a toolkit for implementing neural network models based on dynamic declaration of network structure.
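Since dynamic declaration is a concrete programming idea, here is a minimal sketch of what it looks like in DyNet's Python API (assuming DyNet 2.x; the toy data and model are ours, not the paper's): a fresh computation graph is declared for every training example, so the graph's shape can follow each input's structure.

    import random
    import dynet as dy

    pc = dy.ParameterCollection()
    W = pc.add_parameters((2, 4))            # toy 2-way classifier over 4-dim inputs
    b = pc.add_parameters(2)
    trainer = dy.SimpleSGDTrainer(pc)

    # Toy data: variable-length sequences of 4-dim vectors with binary labels.
    data = [([[random.random() for _ in range(4)]
              for _ in range(random.randint(1, 5))], random.randint(0, 1))
            for _ in range(20)]

    for seq, label in data:
        dy.renew_cg()                        # declare a fresh graph per example
        vecs = [dy.inputVector(v) for v in seq]
        h = dy.average(vecs)                 # pool the variable-length sequence
        loss = dy.pickneglogsoftmax(W * h + b, label)
        loss.value()                         # run forward
        loss.backward()
        trainer.update()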
A Dependency Parser for Tweets
TLDR: We describe TWEEBOPARSER, a dependency parser for English tweets that achieves over 80% unlabeled attachment score on a new, high-quality test set.
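For readers unfamiliar with the metric quoted above, unlabeled attachment score (UAS) is simply head-prediction accuracy. A minimal sketch (the toy heads below are ours, not the paper's data):

    def uas(gold_heads, pred_heads):
        """Fraction of tokens whose predicted head matches the gold head."""
        assert len(gold_heads) == len(pred_heads)
        return sum(g == p for g, p in zip(gold_heads, pred_heads)) / len(gold_heads)

    # One head index per token (0 = root); a hypothetical 5-token tweet.
    gold = [2, 0, 2, 5, 3]
    pred = [2, 0, 2, 3, 3]
    print(uas(gold, pred))   # 0.8, i.e. 80% UAS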
What Do Recurrent Neural Network Grammars Learn About Syntax?
TLDR: We investigate what recurrent neural network grammars learn about syntax, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism to enable closer inspection.
Segmental Recurrent Neural Networks
TLDR: We introduce segmental recurrent neural networks (SRNNs), which define, given an input sequence, a joint probability distribution over segmentations of the input and labelings of the segments.
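The distribution over segmentations is the crux, so here is an illustrative log-space dynamic program of the kind SRNNs marginalize with; the stand-in scorer below is hypothetical and replaces the paper's bidirectional-RNN segment embeddings:

    import math

    def logaddexp(a, b):
        if a == -math.inf: return b
        if b == -math.inf: return a
        m = max(a, b)
        return m + math.log(math.exp(a - m) + math.exp(b - m))

    def score_segment(x, i, j, label):
        # Hypothetical stand-in for the SRNN segment/label scorer.
        return -0.5 * (j - i) + (0.1 if label == "A" else 0.0)

    def log_partition(x, labels, max_seg_len=4):
        """alpha[j] = log sum over all labeled segmentations of x[:j]."""
        n = len(x)
        alpha = [-math.inf] * (n + 1)
        alpha[0] = 0.0
        for j in range(1, n + 1):
            for i in range(max(0, j - max_seg_len), j):
                for y in labels:
                    alpha[j] = logaddexp(alpha[j],
                                         alpha[i] + score_segment(x, i, j, y))
        return alpha[n]

    print(log_partition(list("abcde"), labels=["A", "B"]))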
Episodic Memory in Lifelong Language Learning
TLDR: We propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in a lifelong learning setup where a model needs to learn from a stream of text examples without any dataset identifier.
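A schematic sketch of the sparse-replay ingredient (all names here are hypothetical; local adaptation, which retrieves neighbors from the same memory at test time for a few gradient steps, is omitted):

    import random

    class EpisodicMemory:
        """Append-only example store; the paper keys it for neighbor retrieval."""
        def __init__(self):
            self.buffer = []
        def write(self, example):
            self.buffer.append(example)
        def sample(self, k):
            return random.sample(self.buffer, min(k, len(self.buffer)))

    def train_on_stream(stream, train_step, memory,
                        replay_every=100, replay_batch=32):
        for step, example in enumerate(stream, 1):
            train_step([example])                # ordinary online update
            memory.write(example)
            if step % replay_every == 0:         # replay is sparse: only occasionally
                train_step(memory.sample(replay_batch))

    train_on_stream(iter(range(500)), train_step=lambda batch: None,
                    memory=EpisodicMemory())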
Segmental Recurrent Neural Networks for End-to-End Speech Recognition
TLDR: We study the segmental recurrent neural network for end-to-end acoustic modelling in the context of speech recognition.
Document Context Language Models
TLDR: We present and empirically evaluate a set of multi-level recurrent neural network language models, called Document-Context Language Models (DCLMs), which incorporate contextual information both within and beyond the sentence.
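One way to picture the cross-sentence conditioning, as a numpy sketch under our own toy dimensions (the paper explores several wirings; this one feeds a carried-over context vector into every word prediction):

    import numpy as np

    rng = np.random.default_rng(0)
    V, d = 50, 16                         # toy vocabulary and hidden sizes
    E  = rng.normal(0, 0.1, (V, d))       # word embeddings
    Wh = rng.normal(0, 0.1, (d, d))       # recurrence
    Wc = rng.normal(0, 0.1, (d, d))       # document-context projection
    Wo = rng.normal(0, 0.1, (V, d))       # output layer

    def sentence_logprob(words, c):
        """Log-probability of one sentence given context c from prior sentences."""
        h, total = np.zeros(d), 0.0
        for w in words:
            logits = Wo @ np.tanh(h + Wc @ c)   # context enters every prediction
            z = logits - logits.max()
            total += z[w] - np.log(np.exp(z).sum())
            h = np.tanh(Wh @ h + E[w])
        return total, h                          # final h becomes the next context

    c = np.zeros(d)
    for sent in [[3, 7, 12], [5, 9], [1, 4, 8, 2]]:   # a toy 3-sentence document
        lp, c = sentence_logprob(sent, c)
        print(lp)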
Learning and Evaluating General Linguistic Intelligence
TLDR: We define general linguistic intelligence as the ability to reuse previously acquired knowledge about a language's lexicon, syntax, semantics, and pragmatic conventions to adapt to new tasks.
SyntaxNet Models for the CoNLL 2017 Shared Task
TLDR: We describe a baseline dependency parsing system for the CoNLL 2017 Shared Task.
DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks
TLDR: We propose a modular framework for constructing new recurrent neural architectures that generalizes the encoder/decoder concept to include explicit structure.
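To make "dynamically connected" concrete, a schematic sketch (our own minimal abstraction, not the DRAGNN API): a downstream transition system draws some of its recurrent inputs from an upstream component's states through explicit links, rather than through a single encoder/decoder vector.

    class Component:
        """One transition system; `link` lets a step read another's states."""
        def __init__(self, name, step_fn):
            self.name, self.step_fn, self.states = name, step_fn, []
        def run(self, inputs, link=None):
            for i, x in enumerate(inputs):
                linked = link.states[i] if link is not None else None
                self.states.append(self.step_fn(x, self.states, linked))
            return self.states

    # Hypothetical step functions: a tagger encodes tokens; each parser step
    # also consumes the tagger state for the same position (the explicit link).
    tagger = Component("tagger", lambda x, hist, _: ("enc", x))
    parser = Component("parser", lambda x, hist, linked: ("act", x, linked))

    tokens = ["I", "saw", "her"]
    tagger.run(tokens)
    print(parser.run(tokens, link=tagger))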