Recurrent Neural Network Grammars

@inproceedings{Dyer2016RecurrentNN,
  title={Recurrent Neural Network Grammars},
  author={Chris Dyer and Adhiguna Kuncoro and Miguel Ballesteros and Noah A. Smith},
  booktitle={HLT-NAACL},
  year={2016}
}
We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese. 
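To make "explicit phrase structure" concrete: an RNNG derives a tree top-down with three action types, NT(X) (open a nonterminal X), GEN(w) (generate a terminal word w), and REDUCE (close the most recent open constituent). The snippet below is a minimal sketch, not the authors' implementation: it merely replays a fixed action sequence with a plain stack to show how such a derivation yields a bracketed tree, whereas the actual model assigns a probability to every action using RNN encodings of the stack, the words generated so far, and the action history. The example sentence follows the paper's running example; the helper function name is illustrative.

# Minimal sketch of an RNNG-style generative action sequence
# (NT, GEN, REDUCE) replayed with a plain stack.

def apply_actions(actions):
    """Replay (action, argument) pairs and return a bracketed tree string."""
    stack = []
    for act, arg in actions:
        if act == "NT":                       # open a constituent, e.g. (NP
            stack.append(("OPEN", arg))
        elif act == "GEN":                    # generate a terminal word
            stack.append(arg)
        elif act == "REDUCE":                 # close the newest open constituent
            children = []
            while not isinstance(stack[-1], tuple):
                children.append(stack.pop())
            _, label = stack.pop()
            stack.append("(" + label + " " + " ".join(reversed(children)) + ")")
        else:
            raise ValueError("unknown action: " + act)
    assert len(stack) == 1, "a complete derivation leaves exactly one tree"
    return stack[0]

# Illustrative derivation of "the hungry cat meows ."
actions = [
    ("NT", "S"),
    ("NT", "NP"), ("GEN", "the"), ("GEN", "hungry"), ("GEN", "cat"), ("REDUCE", None),
    ("NT", "VP"), ("GEN", "meows"), ("REDUCE", None),
    ("GEN", "."), ("REDUCE", None),
]
print(apply_actions(actions))   # (S (NP the hungry cat) (VP meows) .)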

Citations

Publications citing this paper (222 in total); the following selection is marked as highly influenced:

  • Direct Output Connection for a High-Rank Language Model

  • Neural Syntactic Generative Models with Exact Marginalization

  • Encoder-Decoder Shift-Reduce Syntactic Parsing

  • In-Order Transition-based Constituent Parsing

  • Parsing as Language Modeling

Citation statistics

  • 46 highly influenced citations

  • An average of 66 citations per year from 2017 through 2019
