Promoting the Knowledge of Source Syntax in Transformer NMT Is Not Needed

@article{Pham2019PromotingTK,
  title={Promoting the Knowledge of Source Syntax in Transformer NMT Is Not Needed},
  author={Thuong-Hai Pham and Dominik Mach{\'a}{\v{c}}ek and Ond{\v{r}}ej Bojar},
  journal={Computaci{\'o}n y Sistemas},
  year={2019},
  volume={23}
}
  • Thuong-Hai Pham, Dominik Macháček, Ondřej Bojar
  • Published 2019
  • Computer Science
  • Computación y Sistemas
  • The utility of linguistic annotation in neural machine translation seemed to have been established in past papers. The experiments were, however, limited to recurrent sequence-to-sequence architectures and relatively small data settings. We focus on the state-of-the-art Transformer model and use comparably larger corpora. Specifically, we try to promote the knowledge of source-side syntax using multi-task learning either through simple data manipulation techniques or through a dedicated model…
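
A hedged reading of the "simple data manipulation" route to multi-task learning mentioned in the abstract is to interleave translation pairs with source-side parsing pairs, marking each example with a task token so a single sequence-to-sequence model learns both outputs. The sketch below illustrates that mixing step only; the task tokens, the bracketed dependency linearisation, and the mix_tasks helper are illustrative assumptions, not the paper's exact recipe.

    # Minimal multi-task data mixing: translation and source-side parsing
    # examples share one encoder-decoder, distinguished by a task token.
    # Task tokens and the linearisation format are assumptions for illustration.
    from typing import List, Tuple

    def mix_tasks(
        translation_pairs: List[Tuple[str, str]],
        parsing_pairs: List[Tuple[str, str]],
    ) -> List[Tuple[str, str]]:
        """Interleave translation and parsing examples into one training stream."""
        mixed = []
        for src, tgt in translation_pairs:
            mixed.append(("<translate> " + src, tgt))  # translation task
        for src, tree in parsing_pairs:
            mixed.append(("<parse> " + src, tree))     # source-side parsing task
        return mixed

    if __name__ == "__main__":
        translation = [("the cat sleeps", "kočka spí")]
        # Target of the parsing task: a linearised dependency tree (one possible format).
        parsing = [("the cat sleeps", "(root sleeps (nsubj cat (det the)))")]
        for src, tgt in mix_tasks(translation, parsing):
            print(src, "=>", tgt)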
