Best Practices for Learning Domain-Specific Cross-Lingual Embeddings

@article{Shakurova2019BestPF,
  title={Best Practices for Learning Domain-Specific Cross-Lingual Embeddings},
  author={L. Shakurova and Beata Nyari and Chao Li and M. Rotaru},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.03112}
}
Cross-lingual embeddings aim to represent words from multiple languages in a shared vector space, capturing semantic similarities across languages. They are a crucial component for scaling tasks to multiple languages by transferring knowledge from resource-rich languages to low-resource ones. A common approach to learning cross-lingual embeddings is to train monolingual embeddings separately for each language and then learn a linear projection from the monolingual spaces into a shared…
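The linear-projection approach described in the abstract can be sketched as follows. This is a minimal illustration using the orthogonal Procrustes solution (a common choice in this literature, not necessarily the exact method of this paper): given seed-dictionary translation pairs, it fits an orthogonal map from the source embedding space to the target space.

```python
import numpy as np

def learn_projection(X, Y):
    """Learn an orthogonal map W minimizing ||XW - Y||_F
    (orthogonal Procrustes solution via SVD). Rows of X and Y are
    monolingual embeddings of seed-dictionary translation pairs."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy demo: pretend the target space is a hidden rotation of the source space.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                   # source-language embeddings
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # hidden orthogonal transform
Y = X @ Q                                      # target-language embeddings
W = learn_projection(X, Y)
print(np.allclose(X @ W, Y, atol=1e-8))        # projection recovers the alignment
```

Restricting W to be orthogonal (rather than an arbitrary linear map) preserves distances and angles in the monolingual space, which is the "monolingual invariance" property discussed in the cross-lingual embedding literature.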
