Dissecting Contextual Word Embeddings: Architecture and Representation

@inproceedings{Peters2018DissectingCW,
  title={Dissecting Contextual Word Embeddings: Architecture and Representation},
  author={Matthew E. Peters and Mark Neumann and Luke S. Zettlemoyer and Wen-tau Yih},
  booktitle={EMNLP},
  year={2018}
}
Contextual word representations derived from pre-trained bidirectional language models (biLMs) have recently been shown to provide significant improvements to the state of the art for a wide range of NLP tasks. However, many questions remain as to how and why these models are so effective. In this paper, we present a detailed empirical study of how the choice of neural architecture (e.g. LSTM, CNN, or self-attention) influences both end task accuracy and qualitative properties of the…
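
As a rough illustration of the layer-wise analysis the abstract describes, the sketch below extracts per-layer contextual representations from a pretrained model and measures how far each layer drifts from the context-independent embedding layer. The library (transformers) and model name are illustrative assumptions, not the authors' setup: the paper studies ELMo-style biLMs with LSTM, CNN, and Transformer variants, and probes each layer with lightweight classifiers.

```python
# Minimal sketch (not the authors' code): inspect per-layer hidden states
# of a pretrained bidirectional model, in the spirit of the paper's
# layer-wise probing. "bert-base-uncased" is an assumption for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_hidden_states=True)
model.eval()

inputs = tokenizer("Contextual embeddings vary with network depth.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states is a tuple: (embedding layer, layer 1, ..., layer N).
# Cosine similarity of one token's vector to its embedding-layer vector is a
# crude proxy for how contextual each layer's representation has become.
emb = outputs.hidden_states[0][0, 1]  # first real token, embedding layer
for depth, layer in enumerate(outputs.hidden_states):
    sim = torch.cosine_similarity(emb, layer[0, 1], dim=0).item()
    print(f"layer {depth:2d}: cosine to embedding layer = {sim:.3f}")
```

In the paper's analysis, this kind of depth sweep shows representations shifting from morphology at the embedding layer, through local syntax in lower contextual layers, to longer-range semantics higher up.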

Citations

Publications citing this paper (6 of 52 shown).

Casting Light on Invisible Cities: Computationally Engaging with Literary Criticism

  • NAACL-HLT
  • 2019
  • Cites methods & background; highly influenced

Designing and Interpreting Probes with Control Tasks

  • IJCNLP 2019
  • 2019
  • Cites background & methods; highly influenced

Efficient Contextual Representation Learning With Continuous Outputs

  • Transactions of the Association for Computational Linguistics
  • 2019
  • Cites methods & background; highly influenced

What Do You Learn From Context? Probing for Sentence Structure in Contextualized Word Representations

  • Cites methods, background & results; highly influenced

Language Models as Knowledge Bases?

  • IJCNLP 2019
  • 2019
  • Cites methods & background; highly influenced

Shallow Syntax in Deep Water

  • Cites methods, background & results; highly influenced

Citation Statistics

  • 29 highly influenced citations

  • Averaged 26 citations per year from 2018 through 2019
