Edinburgh Neural Machine Translation Systems for WMT 16

@inproceedings{Sennrich2016EdinburghNM,
  title={Edinburgh Neural Machine Translation Systems for WMT 16},
  author={Rico Sennrich and Barry Haddow and Alexandra Birch},
  booktitle={WMT},
  year={2016}
}
We participated in the WMT 2016 shared news translation task by building neural translation systems for four language pairs, each trained in both directions: English↔Czech, English↔German, English↔Romanian and English↔Russian. Our systems are based on an attentional encoder-decoder, using BPE subword segmentation for open-vocabulary translation with a fixed vocabulary. We experimented with using automatic back-translations of the monolingual News corpus as additional training data, pervasive dropout, and target-bidirectional models.

Citations

The University of Edinburgh’s Neural MT Systems for WMT17
NRC Machine Translation System for WMT 2017
Alibaba’s Neural Machine Translation Systems for WMT18
FBK's Participation to the English-to-German News Translation Task of WMT 2017
The Helsinki Neural Machine Translation System
Our Neural Machine Translation Systems for WAT 2019
The RWTH Aachen University English-German and German-English Machine Translation System for WMT 2017
CUNI Transformer Neural MT System for WMT18
TMU Japanese-Chinese Unsupervised NMT System for WAT 2018 Translation Task
...

References

Improving Neural Machine Translation Models with Monolingual Data
Neural Machine Translation of Rare Words with Subword Units
The QT21/HimL Combined Machine Translation System
Neural Machine Translation by Jointly Learning to Align and Translate
Dropout Improves Recurrent Neural Networks for Handwriting Recognition
A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
CzEng 1.6: Enlarged Czech-English Parallel Corpus with Processing Tools Dockered
On the difficulty of training recurrent neural networks
...