Improving Neural Machine Translation Models with Monolingual Data

@article{Sennrich2016ImprovingNM,
  title={Improving Neural Machine Translation Models with Monolingual Data},
  author={Rico Sennrich and Barry Haddow and Alexandra Birch},
  journal={ArXiv},
  year={2016},
  volume={abs/1511.06709}
}
Neural Machine Translation (NMT) has obtained state-of-the-art performance for several language pairs, while only using parallel data for training. Target-side monolingual data plays an important role in boosting fluency for phrase-based statistical machine translation, and we investigate the use of monolingual data for NMT. In contrast to previous work, which combines NMT models with separately trained language models, we note that encoder-decoder NMT architectures already have the capacity to…
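As context for the truncated abstract: the approach this paper is widely known for is back-translation, in which a separately trained target-to-source model translates target-side monolingual text into synthetic source sentences, and the resulting synthetic pairs are mixed with the genuine parallel corpus when training the forward (source-to-target) model. The sketch below is only a minimal illustration of that data-preparation idea; BackwardModel, its translate method, and the mixing function are hypothetical placeholders, not the authors' code.

```python
# Minimal sketch of back-translation data preparation (hypothetical, for illustration).
from typing import Iterable, List, Tuple


class BackwardModel:
    """Placeholder for any trained target->source translation model."""

    def translate(self, target_sentence: str) -> str:
        # In practice this would run decoding (e.g. beam search) with a real NMT system.
        raise NotImplementedError("plug in a trained target->source model here")


def back_translate(model: BackwardModel,
                   monolingual_target: Iterable[str]) -> List[Tuple[str, str]]:
    """Turn target-side monolingual sentences into synthetic (source, target) pairs."""
    synthetic_pairs = []
    for tgt in monolingual_target:
        synthetic_src = model.translate(tgt)          # synthetic source side
        synthetic_pairs.append((synthetic_src, tgt))  # target side stays real text
    return synthetic_pairs


def build_training_data(parallel: List[Tuple[str, str]],
                        synthetic: List[Tuple[str, str]]) -> List[Tuple[str, str]]:
    """Mix genuine and synthetic pairs; the forward model is then trained on the union."""
    return parallel + synthetic
```

The key property is that the target side of every synthetic pair is authentic text, so the decoder is still trained on fluent target-language sentences; mixing ratios and other training details are not covered by the abstract shown above.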
Joint Training for Neural Machine Translation Models with Monolingual Data
Semi-Supervised Learning for Neural Machine Translation
Multi-task Learning for Multilingual Neural Machine Translation
Using Target-side Monolingual Data for Neural Machine Translation through Multi-task Learning
Exploiting Monolingual Data at Scale for Neural Machine Translation
Can Monolingual Embeddings Improve Neural Machine Translation?
Exploiting Source-side Monolingual Data in Neural Machine Translation
Improving Neural Machine Translation with Pre-trained Representation
