Training a code-switching language model with monolingual data

@article{Chuang2019TrainingAC,
  title={Training a code-switching language model with monolingual data},
  author={Shun-Po Chuang and Tzu-Wei Sung and Hung-yi Lee},
  journal={ArXiv},
  year={2019},
  volume={abs/1911.06003}
}
A lack of code-switching data complicates the training of code-switching (CS) language models. We propose an approach to train such CS language models on monolingual data only. By constraining and normalizing the output projection matrix in RNN-based language models, we bring the embeddings of different languages closer to each other. Numerical and visualization results show that the proposed approaches remarkably improve the performance of CS language models trained on monolingual data. The…
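The abstract's core idea, constraining and normalizing the output projection matrix so word embeddings of different languages become comparable, can be sketched as follows. This is an illustrative reading (row-wise L2 normalization of the output embeddings, so logits become cosine-like scores), not necessarily the authors' exact formulation:

```python
import numpy as np

def normalized_output_logits(hidden, proj):
    """Logits from an L2-normalized output projection matrix.

    Row-normalizing `proj` puts every word's output embedding on the
    unit sphere, so no language's vocabulary can dominate the softmax
    purely through larger embedding norms.  Illustrative sketch only;
    the paper's actual constraint may differ in detail.
    """
    proj_norm = proj / np.linalg.norm(proj, axis=1, keepdims=True)
    return hidden @ proj_norm.T

rng = np.random.default_rng(0)
hidden = rng.normal(size=(4,))    # RNN hidden state (dimension 4)
proj = rng.normal(size=(6, 4))    # output embeddings for a toy 6-word vocabulary
logits = normalized_output_logits(hidden, proj)   # one score per word
```

After normalization, every row of the projection has unit norm, so the relative logits depend only on the direction of each word embedding, which is one way to keep the two languages' embedding spaces on a comparable scale.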
