Corpus ID: 108300988

What do you learn from context? Probing for sentence structure in contextualized word representations

@article{Tenney2019WhatDY,
  title={What do you learn from context? Probing for sentence structure in contextualized word representations},
  author={Ian Tenney and Patrick Xia and B. Chen and Alex Wang and Adam Poliak and R. Thomas McCoy and Najoung Kim and Benjamin Van Durme and Samuel R. Bowman and Dipanjan Das and Ellie Pavlick},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.06316}
}
  • Abstract: Contextualized representation models such as ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2018) have recently achieved state-of-the-art results on a diverse array of downstream NLP tasks. [...] We probe word-level contextual representations from four recent models and investigate how they encode sentence structure across a range of syntactic, semantic, local, and long-range phenomena. We find that existing models trained on language modeling and translation produce strong representations for… (A schematic sketch of this probing setup is given below.)
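
A minimal sketch of the probing setup the abstract describes: a small classifier trained on top of frozen contextual token vectors, written here in PyTorch. This is not the authors' released code; the encoder is stubbed with random vectors, and the class name SpanProbe, the mean-pooling choice, and all sizes are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

class SpanProbe(nn.Module):
    """Tiny classifier trained on top of frozen contextual token vectors."""
    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(hidden_dim, 256),  # small hidden layer; width is arbitrary
            nn.ReLU(),
            nn.Linear(256, num_labels),
        )

    def forward(self, token_reps, span):
        # Mean-pool the frozen token vectors inside [start, end), then classify.
        start, end = span
        pooled = token_reps[start:end].mean(dim=0)
        return self.scorer(pooled)

# Hypothetical usage: predict a label (e.g., a constituent type) for one span.
hidden_dim, num_labels = 768, 5            # stand-in sizes, not from the paper
token_reps = torch.randn(12, hidden_dim)   # stub for a frozen encoder's output
probe = SpanProbe(hidden_dim, num_labels)
logits = probe(token_reps, span=(3, 6))    # score tokens 3-5 as a single span
loss = nn.functional.cross_entropy(logits.unsqueeze(0), torch.tensor([2]))
loss.backward()  # gradients reach only the probe; the encoder stays frozen
```

The paper's full "edge probing" architecture is richer than this sketch: it uses self-attentive span pooling and accepts two spans for relation tasks such as semantic role labeling, but the frozen-encoder-plus-small-classifier pattern is the same.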
    249 Citations

    Context Analysis for Pre-trained Masked Language Models (Highly Influenced)
    Quantifying the Contextualization of Word Representations with Semantic Class Probing (2 citations; Highly Influenced)
    What does it mean to be language-agnostic? Probing multilingual sentence encoders for typological properties (3 citations; Highly Influenced)
    Linguistic Knowledge and Transferability of Contextual Representations (227 citations)
    Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation (3 citations)
    Does Chinese BERT Encode Word Structure?
    Infusing Finetuning with Semantic Dependencies (Highly Influenced)
    Probing What Different NLP Tasks Teach Machines about Function Word Comprehension (34 citations)

    References

    Showing 1-10 of 55 references
    Dissecting Contextual Word Embeddings: Architecture and Representation (175 citations; Highly Influential)
    Deep contextualized word representations (5,022 citations; Highly Influential)
    Evaluating Compositionality in Sentence Embeddings (71 citations)
    Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks (273 citations)
    Deep RNNs Encode Soft Hierarchical Syntax (63 citations; Highly Influential)
    Linguistically-Informed Self-Attention for Semantic Role Labeling (201 citations; Highly Influential)
    Learned in Translation: Contextualized Word Vectors (543 citations; Highly Influential)
    Supervised Learning of Universal Sentence Representations from Natural Language Inference Data (1,111 citations; Highly Influential)