Modeling Source Syntax for Neural Machine Translation

@inproceedings{Li2017ModelingSS,
  title={Modeling Source Syntax for Neural Machine Translation},
  author={Junhui Li and Deyi Xiong and Zhaopeng Tu and Muhua Zhu and Min Zhang and Guodong Zhou},
  booktitle={ACL},
  year={2017}
}
Even though a linguistics-free sequence-to-sequence model in neural machine translation (NMT) has some capability of implicitly learning syntactic information about source sentences, this paper shows that source syntax can be explicitly and effectively incorporated into NMT to provide further improvements. Specifically, we linearize parse trees of source sentences to obtain structural label sequences. On this basis, we propose three different kinds of encoders to incorporate source syntax into NMT…
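The abstract's key idea is linearizing a source-side parse tree into a sequence of structural labels. As a rough illustration (this is a simplified sketch, not the authors' exact linearization scheme; the tree encoding and traversal order here are assumptions), a pre-order traversal that emits each constituent label before the words it covers yields a mixed sequence of syntactic labels and source words, similar in spirit to the input of the paper's Mixed RNN encoder:

```python
# Hypothetical parse-tree linearization sketch (not the paper's exact scheme).
# A tree is a nested tuple: (label, child, child, ...); leaves are word strings.
def linearize(tree):
    """Pre-order traversal: emit each structural label before its subtree's words."""
    if isinstance(tree, str):        # leaf node: a source word
        return [tree]
    label, *children = tree
    seq = [label]                    # structural label from the parse tree
    for child in children:
        seq.extend(linearize(child))
    return seq

# Toy parse of "the cat sleeps"
tree = ("S", ("NP", ("DT", "the"), ("NN", "cat")), ("VP", ("VB", "sleeps")))
print(linearize(tree))
# ['S', 'NP', 'DT', 'the', 'NN', 'cat', 'VP', 'VB', 'sleeps']
```

The resulting label-and-word sequence can then be fed to a standard sequence encoder, which is what lets an otherwise tree-unaware NMT model see explicit syntax.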
Highly Cited: this paper has 46 citations and has been referenced on Twitter 34 times.


Key Quantitative Results

  • It is interesting to note that the simplest encoder, i.e., the Mixed RNN encoder, yields the best performance, with a significant improvement of 1.4 BLEU points.

