Corpus ID: 32650334

English-Japanese Neural Machine Translation with Encoder-Decoder-Reconstructor

  • Yukio Matsumura, T. Sato, Mamoru Komachi
  • Published 2017
  • Computer Science
  • arXiv
  • Neural machine translation (NMT) has recently become popular in the field of machine translation. However, NMT suffers from the problem of repeating or missing words in the translation. To address this problem, Tu et al. (2017) proposed an encoder-decoder-reconstructor framework for NMT using back-translation. In this method, they selected the best forward translation model in the same manner as Bahdanau et al. (2015), and then trained a bi-directional translation model as fine-tuning. Their…
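The reconstructor idea summarized above can be sketched numerically: a reconstructor reads the decoder's hidden states and tries to reproduce the source sentence, so the training loss penalizes translations that drop or repeat source content. This is only a toy illustration of the combined objective; the tanh-RNN cells, parameter sizes, random initialization, and the naive alignment of decoder states to source tokens are all simplifying assumptions, not the paper's attention-based architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 12, 8          # toy vocabulary size and hidden size
src = [3, 5, 7]       # source token ids
tgt = [2, 4, 6, 1]    # target token ids

def rnn(tokens, E, W, U):
    """Simple tanh RNN over token ids; returns all hidden states (T, H)."""
    h = np.zeros(H)
    states = []
    for t in tokens:
        h = np.tanh(E[t] @ W + h @ U)
        states.append(h)
    return np.stack(states)

def nll(states, tokens, Wout):
    """Negative log-likelihood of `tokens` under softmax(states @ Wout)."""
    logits = states @ Wout
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -sum(logp[i, t] for i, t in enumerate(tokens))

# Randomly initialized parameters; actual training is omitted in this sketch.
E = rng.normal(size=(V, H))
Wout = rng.normal(size=(H, V))
enc = rnn(src, E, rng.normal(size=(H, H)) * 0.1, rng.normal(size=(H, H)) * 0.1)
dec = rnn(tgt, E, rng.normal(size=(H, H)) * 0.1, rng.normal(size=(H, H)) * 0.1)

# Forward translation loss: the decoder predicts the target sentence.
loss_translation = nll(dec, tgt, Wout)

# Reconstruction loss: the reconstructor reads decoder states and predicts
# the *source* sentence back (the back-translation signal). Taking the first
# |src| decoder states is a toy stand-in for the paper's attention mechanism.
rec_states = dec[:len(src)]
loss_reconstruction = nll(rec_states, src, Wout)

lam = 1.0  # interpolation weight between the two objectives
loss = loss_translation + lam * loss_reconstruction
print(round(float(loss), 3))
```

Because the reconstruction term is only low when the decoder states still carry enough information to recover the source, minimizing the combined loss discourages the missing-word and repeated-word errors the abstract describes.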



    Publications referenced by this paper:
    • Neural Machine Translation by Jointly Learning to Align and Translate
    • Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
    • Neural Machine Translation with Reconstruction
    • Modeling Coverage for Neural Machine Translation
    • Interactive Attention for Neural Machine Translation
    • Pre-Translation for Neural Machine Translation
    • Coverage Embedding Models for Neural Machine Translation
    • Improving Attention Modeling with Implicit Distortion and Fertility for Machine Translation
    • Overview of the Patent Machine Translation Task at the NTCIR-10 Workshop
    • Minimum Risk Training for Neural Machine Translation