Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism

@inproceedings{Firat2016MultiWayMN,
  title={Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism},
  author={Orhan Firat and Kyunghyun Cho and Yoshua Bengio},
  booktitle={HLT-NAACL},
  year={2016}
}
We propose multi-way, multilingual neural machine translation. The proposed approach enables a single neural translation model to translate between multiple languages, with a number of parameters that grows only linearly with the number of languages. This is made possible by having a single attention mechanism that is shared across all language pairs. We train the proposed multi-way, multilingual model on ten language pairs from WMT'15 simultaneously and observe clear performance improvements…
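The shared-attention idea from the abstract can be illustrated with a short NumPy sketch (a hypothetical toy, not the authors' implementation): each language gets its own encoder and decoder parameter block, while every source-target pair routes through a single shared attention matrix, so the number of parameter blocks grows linearly, not quadratically, in the number of languages. All names and shapes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size (assumption, for illustration only)

def make_params(shape):
    return rng.standard_normal(shape) * 0.1

class SharedAttentionMT:
    """Toy multi-way model: N encoders, N decoders, ONE shared attention."""
    def __init__(self, languages, d):
        self.d = d
        # per-language encoder/decoder blocks: O(N) in the number of languages
        self.encoders = {l: make_params((d, d)) for l in languages}
        self.decoders = {l: make_params((d, d)) for l in languages}
        # a single attention matrix shared across all language pairs: O(1)
        self.W_att = make_params((d, d))

    def translate_states(self, src, tgt, src_states, tgt_state):
        # encode source states with the source language's encoder
        h = src_states @ self.encoders[src]          # (T, d)
        # score each source position with the SHARED attention
        scores = h @ self.W_att @ tgt_state          # (T,)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                     # softmax over positions
        context = weights @ h                        # (d,)
        # decode the context with the target language's decoder
        return context @ self.decoders[tgt]          # (d,)

langs = ["en", "de", "fr", "cs"]
model = SharedAttentionMT(langs, d)
# parameter blocks: 2*N encoder/decoder matrices + 1 shared attention
n_blocks = len(model.encoders) + len(model.decoders) + 1
print(n_blocks)  # 9 blocks for 4 languages: linear growth, vs N*(N-1) pairwise models
```

With pairwise models, 4 languages would need 12 separate translation systems; here adding a fifth language adds only one encoder and one decoder, since the attention is reused.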


