Recurrent Neural Network Grammars

@inproceedings{Dyer2016RecurrentNN,
  title={Recurrent Neural Network Grammars},
  author={Chris Dyer and Adhiguna Kuncoro and Miguel Ballesteros and Noah A. Smith},
  booktitle={HLT-NAACL},
  year={2016}
}
  • Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, Noah A. Smith
  • Published in HLT-NAACL 2016
  • Computer Science
  • We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese. 
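The abstract's core mechanism, generating a sentence jointly with its phrase-structure tree through a sequence of transitions, can be illustrated with a short sketch. The Python below is a minimal, hypothetical rendering of the generative action sequence (NT(X), GEN(w), REDUCE) described in the paper: the action names and the running example follow the paper, while the data structures and function names are illustrative assumptions, and the neural scoring of actions is omitted entirely.

# Minimal sketch of the RNNG generative transition system.
# Assumptions: plain Python data structures, no neural scoring;
# only the action names NT/GEN/REDUCE follow Dyer et al. (2016).

def execute(actions):
    """Apply (action, label) pairs and return the finished tree.

    NT(X)  pushes an open nonterminal X onto the stack.
    GEN(w) generates the terminal word w onto the stack.
    REDUCE pops completed items back to the most recent open
    nonterminal and replaces them with a finished subtree.
    """
    stack = []
    for action, label in actions:
        if action == "NT":        # open a new constituent
            stack.append(("OPEN", label))
        elif action == "GEN":     # generate a terminal word
            stack.append(label)
        elif action == "REDUCE":  # close the most recent open constituent
            children = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "OPEN"):
                children.append(stack.pop())
            _, nonterminal = stack.pop()
            children.reverse()
            stack.append((nonterminal, children))
    assert len(stack) == 1, "a well-formed action sequence leaves one tree"
    return stack[0]

def show(tree):
    """Render a tree as a bracketed string."""
    if isinstance(tree, str):
        return tree
    nonterminal, children = tree
    return "(" + nonterminal + " " + " ".join(show(c) for c in children) + ")"

# The paper's running example: "The hungry cat meows."
actions = [
    ("NT", "S"),
    ("NT", "NP"), ("GEN", "The"), ("GEN", "hungry"), ("GEN", "cat"),
    ("REDUCE", None),
    ("NT", "VP"), ("GEN", "meows"), ("REDUCE", None),
    ("GEN", "."), ("REDUCE", None),
]
print(show(execute(actions)))  # (S (NP The hungry cat) (VP meows) .)

In the model itself, each of these decisions is scored by RNNs over the parser's stack, the words generated so far, and the action history; the sketch above only replays a fixed action sequence to show how the transitions compose a tree.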
    331 Citations (selected citing papers):
    • Unsupervised Recurrent Neural Network Grammars
    • Parsing as Language Modeling
    • Language Modeling with Shared Grammar
    • Learning Structured Natural Language Representations for Semantic Parsing
    • Neural Parse Combination
    • What Do Recurrent Neural Network Grammars Learn About Syntax?
    • Towards Neural Machine Translation with Latent Tree Attention
    • Neural Language Modeling by Jointly Learning Syntax and Lexicon
    • Syntactic realization with data-driven neural tree grammars
    • Easy-First Dependency Parsing with Hierarchical Tree LSTMs
