What do you learn from context? Probing for sentence structure in contextualized word representations
@article{Tenney2019WhatDY,
  title={What do you learn from context? Probing for sentence structure in contextualized word representations},
  author={Ian Tenney and Patrick Xia and B. Chen and Alex Wang and Adam Poliak and R. Thomas McCoy and Najoung Kim and Benjamin Van Durme and Samuel R. Bowman and Dipanjan Das and Ellie Pavlick},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.06316}
}
Contextualized representation models such as ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2018) have recently achieved state-of-the-art results on a diverse array of downstream NLP tasks. [...] We probe word-level contextual representations from four recent models and investigate how they encode sentence structure across a range of syntactic, semantic, local, and long-range phenomena. We find that existing models trained on language modeling and translation produce strong representations for [...]
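As a concrete illustration of the probing methodology the abstract describes, the sketch below trains a lightweight classifier on frozen contextual representations to predict a per-word linguistic label. It is a minimal sketch only: it assumes the Hugging Face transformers library and bert-base-uncased, uses a toy sentence and tag set in place of the paper's annotated corpora, and substitutes a single linear probe over first-subword token vectors for the paper's edge-probing architecture (which pools span representations through an MLP classifier). It is not the authors' released code.

```python
# Minimal probing sketch (illustrative; not the paper's edge-probing code).
# Assumes: pip install torch transformers
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
encoder.eval()  # the encoder stays frozen; only the probe is trained

# Toy data standing in for an annotated corpus (hypothetical labels).
sentences = [["the", "cat", "sat"], ["a", "dog", "barked"]]
tags = [["DET", "NOUN", "VERB"], ["DET", "NOUN", "VERB"]]
tagset = {"DET": 0, "NOUN": 1, "VERB": 2}

# A single linear layer over frozen token vectors: if it classifies well,
# the information was already present in the representations.
probe = nn.Linear(encoder.config.hidden_size, len(tagset))
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for words, word_tags in zip(sentences, tags):
        enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
        with torch.no_grad():  # representations are fixed features
            hidden = encoder(**enc).last_hidden_state[0]
        # Represent each word by its first subword token.
        word_ids = enc.word_ids()
        first_subword = [word_ids.index(i) for i in range(len(words))]
        feats = hidden[first_subword]
        labels = torch.tensor([tagset[t] for t in word_tags])
        loss = loss_fn(probe(feats), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Freezing the encoder is the key design choice: because only the probe is trained, its accuracy measures what the pretrained representations already encode, rather than what a fine-tuned model could learn to encode.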
249 Citations
Context Analysis for Pre-trained Masked Language Models
- Computer Science. EMNLP 2020. Highly Influenced.

Quantifying the Contextualization of Word Representations with Semantic Class Probing
- Computer Science. EMNLP 2020. 2 citations. Highly Influenced.

What does it mean to be language-agnostic? Probing multilingual sentence encoders for typological properties
- Computer Science. ArXiv 2020. 3 citations. Highly Influenced.

Linguistic Knowledge and Transferability of Contextual Representations
- Computer Science. NAACL-HLT 2019. 227 citations.

Unsupervised Distillation of Syntactic Information from Contextualized Word Representations
- Computer Science. BlackboxNLP 2020.

Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation
- Computer Science. RepL4NLP@ACL 2020. 3 citations.

DL4NLP 2019: Proceedings of the First NLPL Workshop on Deep Learning for Natural Language Processing
- 2019.

Probing What Different NLP Tasks Teach Machines about Function Word Comprehension
- Computer Science. *SEM@NAACL-HLT 2019. 34 citations.
References
Showing 1-10 of 55 references
Dissecting Contextual Word Embeddings: Architecture and Representation
- Computer Science. EMNLP 2018. 175 citations. Highly Influential.

What you can cram into a single vector: Probing sentence embeddings for linguistic properties
- Computer Science. ACL 2018. 325 citations.

Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks
- Computer Science. ICLR 2017. 273 citations.

LSTMs Can Learn Syntax-Sensitive Dependencies Well, But Modeling Structure Makes Them Better
- Computer Science. ACL 2018. 78 citations.

Linguistically-Informed Self-Attention for Semantic Role Labeling
- Computer Science. EMNLP 2018. 201 citations. Highly Influential.

Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
- Computer Science. EMNLP 2017. 1,111 citations. Highly Influential.