Publications
Sequential Neural Networks as Automata
TLDR
This work explains the types of computation that neural networks can perform by relating them to automata, and examines the relationship between neural networks and natural language grammar.
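As an illustration of the general connection this paper studies (not a construction taken from the paper itself), the sketch below simulates a two-state parity DFA with a one-hot hidden state and a hard-threshold recurrent update; the symbol-indexed transition matrices and the step function are choices made for this example.

    import numpy as np

    # Two-state parity DFA: state 0 = even number of 1s seen, state 1 = odd.
    # The hidden state is a one-hot vector over DFA states, and reading a
    # symbol applies that symbol's transition matrix followed by a hard step.
    TRANS = {
        0: np.array([[1.0, 0.0],
                     [0.0, 1.0]]),   # reading 0 keeps the state
        1: np.array([[0.0, 1.0],
                     [1.0, 0.0]]),   # reading 1 flips the state
    }

    def step(x):
        return (x > 0.5).astype(float)   # saturation keeps the state one-hot

    def run(bits):
        h = np.array([1.0, 0.0])         # start in the "even" state
        for b in bits:
            h = step(TRANS[b] @ h)
        return int(h[1])                 # accept iff we end in the "odd" state

    assert run([1, 0, 1, 1]) == 1        # three 1s: odd
    assert run([1, 1, 0]) == 0           # two 1s: even

A trained RNN would fold the input symbol in through learned weights rather than a per-symbol matrix, but this bounded-precision, real-time computation is the kind of object the paper relates to automata.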
Context-Free Transductions with Neural Stacks
TLDR
This paper analyzes the behavior of stack-augmented recurrent neural network (RNN) models.
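For readers unfamiliar with the underlying data structure, here is a minimal numpy sketch of a continuous ("soft") stack of the kind such models build on, where push and pop are real-valued strengths rather than discrete operations; the class name and interface are invented for this example, and the paper's controller and parameterization differ.

    import numpy as np

    class SoftStack:
        """Toy continuous stack: each stored vector carries a real-valued
        strength, so pushing and popping are graded rather than discrete."""

        def __init__(self, dim):
            self.values = np.zeros((0, dim))   # stored vectors, bottom to top
            self.strengths = np.zeros(0)       # how "present" each vector is

        def step(self, push, pop, value):
            # Pop: remove up to `pop` total strength, starting from the top.
            s = self.strengths.copy()
            remaining = pop
            for i in range(len(s) - 1, -1, -1):
                removed = min(s[i], remaining)
                s[i] -= removed
                remaining -= removed
            # Push: append the new vector with strength `push`.
            self.values = np.vstack([self.values, value])
            self.strengths = np.append(s, push)

        def read(self):
            # Weighted average of the topmost vectors, taking at most a
            # total strength of 1 from the top down.
            out = np.zeros(self.values.shape[1])
            budget = 1.0
            for i in range(len(self.strengths) - 1, -1, -1):
                w = min(self.strengths[i], budget)
                out += w * self.values[i]
                budget -= w
                if budget <= 0.0:
                    break
            return out

    stack = SoftStack(dim=3)
    stack.step(push=1.0, pop=0.0, value=np.array([1.0, 0.0, 0.0]))
    stack.step(push=0.5, pop=0.0, value=np.array([0.0, 1.0, 0.0]))
    print(stack.read())   # mixes the top two vectors: [0.5, 0.5, 0.0]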
End-to-end Graph-based TAG Parsing with Neural Networks
TLDR
We present a graph-based Tree Adjoining Grammar (TAG) parser that uses BiLSTMs, highway connections, and character-level CNNs.
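As context for one of the components named above, the following is a generic numpy sketch of a highway connection, a learned gate that mixes a transformed input with the untransformed input. It is not the parser's implementation; the weight shapes and gate bias are arbitrary choices for the example.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def highway(x, W_h, b_h, W_t, b_t):
        """Highway connection: y = t * h + (1 - t) * x, where t is a gate."""
        h = np.tanh(x @ W_h + b_h)    # candidate transformation of the input
        t = sigmoid(x @ W_t + b_t)    # transform gate in (0, 1)
        return t * h + (1 - t) * x    # gate decides how much to transform

    rng = np.random.default_rng(0)
    d = 4
    x = rng.normal(size=(2, d))       # a batch of two feature vectors
    y = highway(x,
                rng.normal(size=(d, d)), np.zeros(d),
                rng.normal(size=(d, d)), np.full(d, -2.0))  # negative gate bias favors carrying x through
    print(y.shape)                    # (2, 4)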
A Formal Hierarchy of RNN Architectures
TLDR
We develop a formal hierarchy of the expressive capacity of RNN architectures based on two formal properties: space complexity, which measures the RNN's memory, and rational recurrence, defined as whether the recurrent update can be described by a weighted finite-state machine.
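To make the second property concrete, here is a small example of my own (not taken from the paper): a recurrent unit that counts occurrences of the symbol 'a' computes the same value as the path-sum score of a two-state weighted finite-state automaton, which is the sense in which such a unit is rationally recurrent.

    import numpy as np

    # WFSA: state 0 loops on every symbol; reading 'a' also adds a path to
    # state 1, which then loops. The score (sum over accepting paths of the
    # product of weights) equals the number of 'a's in the string.
    INIT = np.array([1.0, 0.0])            # start in state 0 with weight 1
    FINAL = np.array([0.0, 1.0])           # accept in state 1
    TRANS = {
        'a': np.array([[1.0, 1.0],
                       [0.0, 1.0]]),
        'b': np.array([[1.0, 0.0],
                       [0.0, 1.0]]),
    }

    def wfsa_score(string):
        alpha = INIT
        for ch in string:
            alpha = alpha @ TRANS[ch]      # forward algorithm over path weights
        return float(alpha @ FINAL)

    def counting_unit(string):
        h = 0.0
        for ch in string:
            h = h + (1.0 if ch == 'a' else 0.0)   # simple additive recurrence
        return h

    s = "abaab"
    assert wfsa_score(s) == counting_unit(s) == 3.0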
Finding Syntactic Representations in Neural Stacks
TLDR
We extract syntactic trees from the pushing behavior of stack RNNs trained on language modeling and classification objectives, demonstrating that stack RNNs do indeed infer linguistically relevant hierarchical structure.
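The extraction step can be pictured with a generic procedure (not necessarily the one used in the paper): given one scalar score per token, such as a push strength, split the sentence recursively at the highest-scoring boundary to obtain an unlabeled binary tree. The splitting rule and the toy scores below are assumptions for illustration.

    def to_tree(tokens, scores):
        """Recursively split at the position with the largest score."""
        if len(tokens) == 1:
            return tokens[0]
        split = max(range(1, len(tokens)), key=lambda i: scores[i])
        return (to_tree(tokens[:split], scores[:split]),
                to_tree(tokens[split:], scores[split:]))

    tokens = ["the", "cat", "sat", "down"]
    push = [0.1, 0.3, 0.9, 0.4]     # toy per-token push strengths
    print(to_tree(tokens, push))    # (('the', 'cat'), ('sat', 'down'))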
Detecting Syntactic Change Using a Neural Part-of-Speech Tagger
TLDR
We train a diachronic long short-term memory (LSTM) part-of-speech tagger on a large corpus of American English from the 19th, 20th, and 21st centuries.
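For reference, a minimal sketch of the kind of model described: an LSTM part-of-speech tagger that assigns one tag per token, written in PyTorch with invented layer sizes. The paper's actual architecture, features, and training setup are not reproduced here.

    import torch
    import torch.nn as nn

    class LSTMTagger(nn.Module):
        """Maps a batch of word-id sequences to per-token tag scores."""

        def __init__(self, vocab_size, tagset_size, emb_dim=64, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, tagset_size)

        def forward(self, word_ids):           # (batch, seq_len)
            states, _ = self.lstm(self.embed(word_ids))
            return self.out(states)            # (batch, seq_len, tagset_size)

    model = LSTMTagger(vocab_size=10000, tagset_size=45)
    logits = model(torch.randint(0, 10000, (2, 7)))
    print(logits.shape)                        # torch.Size([2, 7, 45])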