Deep contextualized word representations

@inproceedings{Peters2018DeepCW,
  title={Deep contextualized word representations},
  author={Matthew E. Peters and Mark Neumann and Mohit Iyyer and Matt Gardner and Christopher Clark and Kenton Lee and Luke S. Zettlemoyer},
  booktitle={NAACL-HLT},
  year={2018}
}
Abstract

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pretrained on a large text corpus. We show that these representations can be easily added to existing models and significantly improve the state…
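The "learned functions of the internal states" the abstract mentions are, in the paper, a softmax-weighted sum over the biLM's layer representations, scaled by a learned scalar. A minimal NumPy sketch of that combination step (function and variable names here are ours for illustration, not the authors' code):

```python
import numpy as np

def combine_layers(layer_states, scalar_weights, gamma=1.0):
    """Collapse stacked biLM layer states into one vector per token.

    layer_states:   array of shape (L+1, seq_len, dim), the biLM's
                    per-layer hidden states for each token.
    scalar_weights: array of shape (L+1,), unnormalized per-layer
                    weights (learned per downstream task).
    gamma:          scalar scaling the whole vector (also learned).
    """
    # Softmax-normalize the layer weights (numerically stable form).
    w = np.exp(scalar_weights - scalar_weights.max())
    s = w / w.sum()
    # Weighted sum over the layer axis: out[t, d] = sum_l s[l] * h[l, t, d]
    return gamma * np.einsum("l,lsd->sd", s, layer_states)

# Toy usage: 3 layers (L=2), 4 tokens, 5-dimensional states.
states = np.random.randn(3, 4, 5)
vecs = combine_layers(states, np.zeros(3))  # equal weights = layer mean
```

With all scalar weights equal (as in the toy call above), the softmax assigns each layer 1/(L+1) and the result is simply the mean over layers; training the weights lets a task emphasize, say, syntax-heavy lower layers or semantics-heavy upper ones.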

Citations

Publications citing this paper: 535 citations, estimated 41% coverage.


Citation statistics:

  • 167 highly influenced citations

  • Averaged 308 citations per year over the last 3 years

