What do RNN Language Models Learn about Filler-Gap Dependencies?

@inproceedings{Wilcox2018WhatDR,
  title={What do RNN Language Models Learn about Filler-Gap Dependencies?},
  author={E. Wilcox and R. Levy and T. Morita and Richard Futrell},
  booktitle={BlackboxNLP@EMNLP},
  year={2018}
}
  • E. Wilcox, R. Levy, T. Morita, Richard Futrell
  • Published in BlackboxNLP@EMNLP 2018
  • Computer Science
  • RNN language models have achieved state-of-the-art perplexity results and have proven useful in a suite of NLP tasks, but it is as yet unclear what syntactic generalizations they learn. Here we investigate whether state-of-the-art RNN language models represent long-distance filler-gap dependencies and constraints on them. Examining RNN behavior on experimentally controlled sentences designed to expose filler-gap dependencies, we show that RNNs can represent the relationship in multiple…
  • 60 Citations
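
    The "experimentally controlled sentences" mentioned in the abstract follow a 2x2 design crossing the presence of a wh-filler with the presence of a gap: a model that has learned the dependency should find a gap less surprising (and an overt object more surprising) when a filler is present. The following is a minimal sketch of that wh-licensing-interaction measurement, not the authors' code: it assumes the Hugging Face transformers library, substitutes GPT-2 for the paper's RNN language models as a readily available autoregressive LM, uses hypothetical example sentences, and scores whole sentences rather than the paper's per-region surprisals.

import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# GPT-2 stands in for the paper's RNN LMs; any autoregressive LM works here.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def surprisal(sentence: str) -> float:
    """Total surprisal of a sentence in bits: sum over t of -log2 P(w_t | w_<t)."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # Position t's logits predict token t+1, so align logits[:-1] with ids[1:].
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    picked = log_probs[torch.arange(ids.size(1) - 1), ids[0, 1:]]
    return (-picked.sum() / math.log(2)).item()

# Hypothetical 2x2 item (filler presence x gap presence) modeled on the
# paper's design; "who" introduces the filler, the missing object is the gap.
conditions = {
    ("-filler", "-gap"): "I know that the guests greeted the host yesterday.",
    ("-filler", "+gap"): "I know that the guests greeted yesterday.",
    ("+filler", "-gap"): "I know who the guests greeted the host yesterday.",
    ("+filler", "+gap"): "I know who the guests greeted yesterday.",
}
s = {cond: surprisal(sent) for cond, sent in conditions.items()}

# Wh-licensing interaction: negative if a filler makes the gap less
# surprising and the overt object more surprising, i.e. the dependency
# is represented by the model.
interaction = (s[("+filler", "+gap")] - s[("+filler", "-gap")]) - (
    s[("-filler", "+gap")] - s[("-filler", "-gap")]
)
print(f"wh-licensing interaction: {interaction:.2f} bits")

    The paper's island-constraint tests extend the same logic, checking whether this interaction disappears when the gap sits inside a syntactic island (e.g., a wh-island or an adjunct).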

    Citations

    SHOWING 1-10 OF 60 CITATIONS

    What Syntactic Structures block Dependencies in RNN Language Models? (13 citations)
    What Don’t RNN Language Models Learn About Filler-Gap Dependencies? (7 citations; highly influenced)
    Structural Supervision Improves Learning of Non-Local Grammatical Dependencies (19 citations)
    Filler-gaps that neural networks fail to generalize (highly influenced)
    The Ability of L2 LSTM Language Models to Learn the Filler-Gap Dependency
    Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study (6 citations)
    Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations (10 citations)
    Assessing the ability of Transformer-based Neural Models to represent structurally unbounded dependencies (4 citations; highly influenced)
    Linguistic Knowledge and Transferability of Contextual Representations (227 citations)
    Do RNNs learn human-like abstract word order preferences? (9 citations)

    References

    SHOWING 1-5 OF 35 REFERENCES
    Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies (427 citations)
    Colorless green recurrent networks dream hierarchically (216 citations)
    Exploring the Limits of Language Modeling (782 citations)
    One billion word benchmark for measuring progress in statistical language modeling (729 citations)
    Grammar as a Foreign Language (763 citations)