Exploring Cross-Lingual Transfer of Morphological Knowledge In Sequence-to-Sequence Models

@inproceedings{Jin2017ExploringCT,
  title={Exploring Cross-Lingual Transfer of Morphological Knowledge In Sequence-to-Sequence Models},
  author={Huiming Jin and Katharina Kann},
  booktitle={SWCN@EMNLP},
  year={2017}
}
Multi-task training is an effective method to mitigate the data sparsity problem. It has recently been applied for cross-lingual transfer learning for paradigm completion—the task of producing inflected forms of lemmata—with sequence-to-sequence networks. However, it remains unclear how the model transfers knowledge across languages, and whether and which information is shared. To investigate this, we propose a set of data-dependent experiments using an existing encoder-decoder recurrent neural…
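For context on the setup the abstract describes, the sketch below illustrates one common way paradigm-completion data is serialized for a character-level encoder-decoder, with a low-resource target language pooled with a high-resource transfer language for multi-task training. This is a minimal illustration under assumptions, not the paper's code; the function names, language codes, and toy examples are hypothetical.

```python
# Hypothetical sketch: serializing paradigm-completion examples for a
# character-level encoder-decoder and pooling two languages' data for
# multi-task training. All names and examples here are illustrative.

from typing import List, Tuple


def encode_example(lemma: str, tags: List[str], lang: str) -> List[str]:
    """Turn (lemma, morphological tags, language) into a source symbol sequence.

    Each lemma character is one symbol; morphological tags and the language
    identifier are single special symbols, e.g.
    ['<deu>', 'N', 'ACC', 'PL', 'H', 'u', 'n', 'd'].
    """
    return [f"<{lang}>"] + tags + list(lemma)


def build_multitask_data(
    target: List[Tuple[str, List[str], str]],
    transfer: List[Tuple[str, List[str], str]],
    target_lang: str,
    transfer_lang: str,
) -> List[Tuple[List[str], List[str]]]:
    """Pool low-resource target and high-resource transfer examples.

    Both languages share one symbol vocabulary and one model; the language
    symbol is the explicit signal for which inflection system to produce.
    """
    data = []
    for lemma, tags, form in target:
        data.append((encode_example(lemma, tags, target_lang), list(form)))
    for lemma, tags, form in transfer:
        data.append((encode_example(lemma, tags, transfer_lang), list(form)))
    return data


if __name__ == "__main__":
    # Toy examples, not taken from the paper's datasets.
    target_examples = [("Hund", ["N", "ACC", "PL"], "Hunde")]   # German (low-resource role)
    transfer_examples = [("hond", ["N", "PL"], "honden")]       # Dutch (transfer role)
    for src, tgt in build_multitask_data(
        target_examples, transfer_examples, "deu", "nld"
    ):
        print(" ".join(src), "->", "".join(tgt))
```

Mixing the two languages into one training set like this lets a single sequence-to-sequence model be trained on both tasks at once, which is the kind of multi-task, cross-lingual setup whose knowledge sharing the paper sets out to probe.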
