Modeling Source Syntax for Neural Machine Translation

@inproceedings{Li2017ModelingSS,
  title={Modeling Source Syntax for Neural Machine Translation},
  author={Junhui Li and Deyi Xiong and Zhaopeng Tu and Muhua Zhu and Min Zhang and Guodong Zhou},
  booktitle={ACL},
  year={2017}
}
Even though a linguistics-free sequence-to-sequence model in neural machine translation (NMT) has a certain capability of implicitly learning syntactic information from source sentences, this paper shows that source syntax can be explicitly and effectively incorporated into NMT to provide further improvements. Specifically, we linearize parse trees of source sentences to obtain structural label sequences. On this basis, we propose three different sorts of encoders to incorporate source syntax into NMT…
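As a rough sketch of the linearization step described above (the paper's exact label scheme and its three encoder variants are not reproduced here), the snippet below flattens a bracketed constituency parse into a mixed sequence of structural labels and words via a depth-first traversal, assuming NLTK's Tree class and a toy parse:

    # A minimal sketch of parse-tree linearization; illustrative only,
    # not the paper's exact scheme.
    from nltk import Tree

    def linearize(tree):
        # Depth-first traversal: emit an opening label for each constituent,
        # the word itself at each leaf, and a closing marker afterwards.
        if isinstance(tree, str):          # leaf: a source word
            return [tree]
        seq = [tree.label()]               # opening structural label, e.g. "NP"
        for child in tree:
            seq.extend(linearize(child))
        seq.append("/" + tree.label())     # closing marker, e.g. "/NP"
        return seq

    parse = Tree.fromstring("(S (NP (PRP I)) (VP (VBP love) (NP (NN music))))")
    print(" ".join(linearize(parse)))
    # -> S NP PRP I /PRP /NP VP VBP love /VBP NP NN music /NN /NP /VP /S

A label/word sequence of this kind can then be consumed by an otherwise standard sequence encoder, either alongside or interleaved with the plain word sequence.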


Citations

Publications citing this paper (showing 8 of 68):

  • Explicitly Modeling Word Translations in Neural Machine Translation (cites methods, background & results)
  • Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations (cites results & methods; highly influenced)
  • Forest-Based Neural Machine Translation (cites background & methods; highly influenced)
  • Syntax-aware Transformer Encoder for Neural Machine Translation (cites methods & background; highly influenced)
  • Learning to Remember Translation History with a Continuous Cache (cites background)
  • Multi-Source Syntactic Neural Machine Translation (cites methods & background; highly influenced)
  • Structure-Infused Copy Mechanisms for Abstractive Summarization (cites background; highly influenced)
  • Joint Parsing and Generation for Abstractive Summarization (cites background & methods)


CITATION STATISTICS

  • 5 highly influenced citations
  • Averaged 22 citations per year from 2017 through 2019

References

Publications referenced by this paper (showing 3 of 34):

  • Neural Machine Translation by Jointly Learning to Align and Translate (highly influential)
  • Grammar as a Foreign Language (highly influential)
  • Context Gates for Neural Machine Translation