Handling Syntactic Divergence in Low-resource Machine Translation

@inproceedings{Zhou2019HandlingSD,
  title={Handling Syntactic Divergence in Low-resource Machine Translation},
  author={Chunting Zhou and X. Ma and J. Hu and Graham Neubig},
  booktitle={EMNLP/IJCNLP},
  year={2019}
}
Despite impressive empirical successes of neural machine translation (NMT) on standard benchmarks, limited parallel data impedes the application of NMT models to many language pairs. Data augmentation methods such as back-translation make it possible to use monolingual data to help alleviate these issues, but back-translation itself fails in extreme low-resource scenarios, especially for syntactically divergent languages. In this paper, we propose a simple yet effective solution, whereby target…
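The back-translation setup the abstract refers to can be sketched as follows. This is a minimal illustration of the general technique, not the paper's specific method; `reverse_translate` is a hypothetical stand-in for a trained target-to-source NMT model.

```python
# Back-translation data augmentation (generic sketch): translate monolingual
# target-language sentences back into the source language with a reverse
# model, then pair each synthetic source with its real target to form
# pseudo-parallel training data.

def reverse_translate(target_sentence: str) -> str:
    # Placeholder: a real system would decode with a trained
    # target->source NMT model here.
    return "<synthetic> " + target_sentence

def back_translate(monolingual_target: list[str]) -> list[tuple[str, str]]:
    """Build (synthetic_source, real_target) pseudo-parallel pairs."""
    return [(reverse_translate(t), t) for t in monolingual_target]

pairs = back_translate(["sentence one", "sentence two"])
```

The augmented `pairs` are then mixed with the true parallel data when training the forward (source-to-target) model; the paper's contribution concerns making such synthetic data useful when the two languages diverge syntactically.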
