Corpus ID: 227339801

Reciprocal Supervised Learning Improves Neural Machine Translation

@article{Xu2020ReciprocalSL,
  title={Reciprocal Supervised Learning Improves Neural Machine Translation},
  author={Minkai Xu and Mingxuan Wang and Zhouhan Lin and Hao Zhou and W. Zhang and Lei Li},
  journal={ArXiv},
  year={2020},
  volume={abs/2012.02975}
}
Despite its recent success in image classification, self-training has achieved only limited gains on structured prediction tasks such as neural machine translation (NMT). This is mainly due to the compositionality of the target space, where far-away prediction hypotheses lead to the notorious reinforced mistake problem. In this paper, we revisit the utilization of multiple diverse models and present a simple yet effective approach named Reciprocal-Supervised Learning (RSL). RSL first…
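Since the abstract is truncated, the sketch below is only an assumed illustration of the idea it describes: several diverse models supervising one another through pseudo-parallel data, so that no model is reinforced solely by its own hypotheses. The names `ToyNMT`, `translate`, `train_step`, and `reciprocal_round`, and the peer-only pooling rule, are all hypothetical and not taken from the paper.

```python
from typing import List, Tuple
import random

class ToyNMT:
    """Toy stand-in for an NMT model (assumed interface, not from the paper)."""
    def __init__(self, seed: int):
        self.rng = random.Random(seed)          # seed makes each model "diverse"
        self.corpus: List[Tuple[str, str]] = []

    def translate(self, src: str) -> str:
        # Dummy decoding: shuffle tokens to mimic a model-specific hypothesis.
        toks = src.split()
        self.rng.shuffle(toks)
        return " ".join(toks)

    def train_step(self, pairs: List[Tuple[str, str]]) -> None:
        # A real model would update its parameters; here we just record the data.
        self.corpus.extend(pairs)

def reciprocal_round(models: List[ToyNMT], sources: List[str]) -> None:
    # 1) Every model decodes the source sentences into pseudo targets.
    pseudo = {i: [(s, m.translate(s)) for s in sources]
              for i, m in enumerate(models)}
    # 2) Every model then trains on pseudo-parallel data produced by its peers,
    #    avoiding the self-training loop where a model reinforces its own mistakes.
    for i, m in enumerate(models):
        m.train_step([p for j, ps in pseudo.items() if j != i for p in ps])

models = [ToyNMT(seed) for seed in range(3)]
reciprocal_round(models, ["ein kleines beispiel", "noch ein satz"])
```

The exclusion of each model's own output in step 2 is one plausible reading of "reciprocal" supervision; the paper's actual combination strategy may differ.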

