Neural Machine Translation with Extended Context

@inproceedings{Tiedemann2017NeuralMT,
  title={Neural Machine Translation with Extended Context},
  author={J{\"o}rg Tiedemann and Yves Scherrer},
  booktitle={DiscoMT@EMNLP},
  year={2017}
}
We investigate the use of extended context in attention-based neural machine translation. We base our experiments on translated movie subtitles and discuss the effect of increasing the segments beyond single translation units. We study the use of extended source language context as well as bilingual context extensions. The models learn to distinguish between information from different segments and are surprisingly robust with respect to translation quality. In this pilot study, we observe…

Citations

Publications citing this paper (20 total; two shown here):

- The University of Helsinki submissions to the WMT19 news translation task
- NTT's Machine Translation Systems for WMT19 Robustness Task (Soichiro Murakami, Makoto Morishita, Tsutomu Hirao, Masaaki Nagata, 2019)