Polyglot Contextual Representations Improve Crosslingual Transfer

@inproceedings{Mulcaire2019PolyglotCR,
  title={Polyglot Contextual Representations Improve Crosslingual Transfer},
  author={Phoebe Mulcaire and Jungo Kasai and Noah A. Smith},
  booktitle={NAACL-HLT},
  year={2019}
}
We introduce Rosita, a method to produce multilingual contextual word representations by training a single language model on text from multiple languages. [...] Our method combines the advantages of contextual word representations with those of multilingual representation learning.
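Rosita builds on ELMo-style contextual embeddings: a bidirectional language model is trained on concatenated corpora from several languages, and its hidden states are fed to downstream crosslingual models. The toy PyTorch sketch below illustrates only the core idea, one language model with one shared vocabulary trained on text from several languages, whose hidden states then serve as contextual word vectors for all of them. The tiny corpus, the word-level forward-only LSTM, and the names PolyglotLM and encode are illustrative assumptions, not the authors' implementation.

# A minimal sketch of the core idea, assuming a toy two-language corpus:
# one language model, one shared vocabulary, hidden states reused as
# multilingual contextual word vectors. Not the paper's code, which trains
# a character-based bidirectional LM in the ELMo family on large corpora.
import torch
import torch.nn as nn

# Tiny mixed English/Spanish corpus: the single model sees both languages.
corpus = [
    "the dog chased the cat".split(),
    "el perro persiguió al gato".split(),
]
vocab = {"<pad>": 0, "<unk>": 1}
for sent in corpus:
    for w in sent:
        vocab.setdefault(w, len(vocab))

def encode(sent):
    return torch.tensor([vocab.get(w, 1) for w in sent])

class PolyglotLM(nn.Module):
    """Forward LSTM language model over a shared multilingual vocabulary."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim, padding_idx=0)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, ids):
        states, _ = self.lstm(self.embed(ids))
        return self.out(states), states  # next-word logits + contextual states

model = PolyglotLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Standard next-word objective, alternating between the two languages,
# so a single set of parameters models both.
for _ in range(200):
    for sent in corpus:
        ids = encode(sent).unsqueeze(0)       # shape (1, T)
        logits, _ = model(ids[:, :-1])        # predict tokens 1..T-1
        loss = loss_fn(logits.squeeze(0), ids[0, 1:])
        opt.zero_grad(); loss.backward(); opt.step()

# The trained LSTM's states are contextual word vectors in one space
# shared across languages; a downstream model (e.g. a dependency parser)
# trained with them on one language can then transfer to another.
with torch.no_grad():
    _, reps = model(encode("el gato".split()).unsqueeze(0))
print(reps.shape)  # torch.Size([1, 2, 64]): one vector per token

The crucial property is that nothing language-specific separates the corpora: the shared parameters and vocabulary force both languages into a single representation space, which is what makes crosslingual transfer possible.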

Citations

Publications citing this paper (showing 6 of 15):

Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT

MultiFiT: Efficient Multi-lingual Language Model Fine-tuning

75 Languages, 1 Model: Parsing Universal Dependencies Universally

Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing

Cross-Lingual Transfer Learning for Question Answering

Cross-lingual Structure Transfer for Relation and Event Extraction
