What Do Recurrent Neural Network Grammars Learn About Syntax?

@inproceedings{Kuncoro2017WhatDR,
  title={What Do Recurrent Neural Network Grammars Learn About Syntax?},
  author={Adhiguna Kuncoro and Miguel Ballesteros and Lingpeng Kong and Chris Dyer and Graham Neubig and Noah A. Smith},
  booktitle={EACL},
  year={2017}
}
Recurrent neural network grammars (RNNG) are a recently proposed probabilistic generative modeling family for natural language. They show state-of-the-art language modeling and parsing performance. We investigate what information they learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection. We find that explicit modeling of composition is crucial for achieving the best performance.
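For readers new to the model, the sketch below illustrates the kind of attention-based composition the abstract alludes to: when a constituent is closed, its children's vectors are combined into one phrase vector, with attention weights (queried by the nonterminal) deciding how much each child contributes. This is a minimal NumPy sketch in the spirit of GA-RNNG, not the authors' implementation; the function name attention_composition and the parameters W_att and W_gate are hypothetical, and the real model encodes children with LSTMs and learns all parameters jointly.

```python
import numpy as np


def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()


def attention_composition(child_vecs, nt_embedding, W_att, W_gate):
    """Compose a phrase vector from its children with attention.

    child_vecs   -- (n_children, d) vectors for the constituent's children
    nt_embedding -- (d,) embedding of the nonterminal being closed (e.g. NP)
    W_att        -- (d, d) hypothetical parameter matrix for attention scores
    W_gate       -- (d, 2d) hypothetical parameter matrix for the gate
    Returns a (d,) composed phrase representation.
    """
    # Score each child against the nonterminal (the attention query),
    # then normalize the scores into a distribution over the children.
    scores = child_vecs @ (W_att @ nt_embedding)          # (n_children,)
    weights = softmax(scores)

    # Attention-weighted summary of the children.
    summary = weights @ child_vecs                        # (d,)

    # Sigmoid gate interpolating between the nonterminal embedding
    # and the attention summary, loosely mirroring GA-RNNG's gating.
    gate_input = np.concatenate([nt_embedding, summary])
    gate = 1.0 / (1.0 + np.exp(-(W_gate @ gate_input)))   # (d,)
    return gate * nt_embedding + (1.0 - gate) * summary


# Toy usage: an NP with three children, vectors of dimension 4.
rng = np.random.default_rng(0)
d = 4
children = rng.normal(size=(3, d))
nt = rng.normal(size=d)
phrase = attention_composition(children, nt,
                               rng.normal(size=(d, d)),
                               rng.normal(size=(d, 2 * d)))
```

Inspecting the learned attention weights over children is what enables the "closer inspection" mentioned in the abstract, since the weights reveal which children dominate each phrasal representation.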

Citations

Publications citing this paper (selection from 71 total citations):

RNN-Based Sequence-Preserved Attention for Dependency Parsing

  • AAAI
  • 2018

Deep learning applications for transition-based dependency parsing

Neural Syntactic Generative Models with Exact Marginalization

  • NAACL-HLT
  • 2018

Scheduled Multi-Task Learning: From Syntax to Translation

  • Transactions of the Association for Computational Linguistics
  • 2018

Language Modeling with Shared Grammar

Yuyu Zhang, Le Song
  • ACL
  • 2019

Effective Subtree Encoding for Easy-First Dependency Parsing

Seq2seq Dependency Parsing

Citation Statistics

  • 9 highly influenced citations

  • Averaged 23 citations per year from 2017 through 2019