Polyglot Contextual Representations Improve Crosslingual Transfer

@article{Mulcaire2019PolyglotCR,
  title={Polyglot Contextual Representations Improve Crosslingual Transfer},
  author={Phoebe Mulcaire and Jungo Kasai and Noah A. Smith},
  journal={arXiv preprint arXiv:1902.09697},
  year={2019}
}
We introduce a method to produce multilingual contextual word representations by training a single language model on text from multiple languages. Our method combines the advantages of contextual word representations with those of multilingual representation learning. We produce language models from dissimilar language pairs (English/Arabic and English/Chinese) and use them in dependency parsing, semantic role labeling, and named entity recognition, with comparisons to monolingual and non…
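The core idea above, one language model trained on a mixed stream of text from multiple languages over a shared vocabulary, can be sketched as follows. This is a minimal illustration of the data setup only, not the paper's implementation; the function names and the toy corpora are assumptions for the example.

```python
# Sketch of the "polyglot" training setup: text from two languages is
# interleaved into one stream, and a single shared character vocabulary
# covers both, so one model is trained over all of it.
from itertools import chain

def build_char_vocab(sentences):
    """Map every character seen in any language to a shared index."""
    chars = sorted(set(chain.from_iterable(sentences)))
    return {c: i for i, c in enumerate(chars)}

def interleave(corpus_a, corpus_b):
    """Alternate sentences so the model sees both languages during training."""
    mixed = []
    for a, b in zip(corpus_a, corpus_b):
        mixed.extend([a, b])
    return mixed

# Toy English/Arabic corpora (illustrative only).
english = ["the cat sat", "dogs bark"]
arabic = ["القط جلس", "الكلاب تنبح"]

vocab = build_char_vocab(english + arabic)   # one vocabulary for both languages
stream = interleave(english, arabic)         # one training stream for one LM
```

A single model trained on `stream` with `vocab` would then supply contextual representations for downstream parsers or taggers in either language.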