Corpus ID: 227339801

Reciprocal Supervised Learning Improves Neural Machine Translation

@article{Xu2020ReciprocalSL,
  title={Reciprocal Supervised Learning Improves Neural Machine Translation},
  author={Minkai Xu and Mingxuan Wang and Zhouhan Lin and Hao Zhou and Weinan Zhang and Lei Li},
  journal={ArXiv},
  year={2020},
  volume={abs/2012.02975}
}
Despite its recent success on image classification, self-training has achieved only limited gains on structured prediction tasks such as neural machine translation (NMT). This is mainly due to the compositionality of the target space, where far-away prediction hypotheses lead to the notorious reinforced mistake problem. In this paper, we revisit the utilization of multiple diverse models and present a simple yet effective approach named Reciprocal-Supervised Learning (RSL). RSL first…
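The abstract is cut off, but the mechanism it gestures at — several diverse models generating pseudo-parallel data and supervising one another, so no single model keeps reinforcing its own mistakes — can be sketched as a toy training round. The sketch below is an assumption about how such reciprocal supervision might look, not the paper's actual algorithm; the names ToyNMTModel, translate, train_step, and reciprocal_round are hypothetical stand-ins.

# Toy sketch of one reciprocal-supervision round, under the assumptions
# stated above. Real models would be full NMT systems, and train_step
# would take gradient steps on a cross-entropy loss over the pairs.

from typing import List, Tuple

class ToyNMTModel:
    """Hypothetical stand-in for one NMT model with its own inductive bias."""

    def __init__(self, name: str):
        self.name = name

    def translate(self, src: str) -> str:
        # Placeholder decoding; a real model would run beam search here.
        return f"{src} -> hypothesis from {self.name}"

    def train_step(self, pairs: List[Tuple[str, str]]) -> None:
        # Placeholder update; a real model would fine-tune on these pairs.
        print(f"{self.name}: training on {len(pairs)} pseudo-parallel pairs")

def reciprocal_round(models: List[ToyNMTModel], src_sentences: List[str]) -> None:
    """One round: every model labels the source sentences, then every model
    trains on the pseudo-parallel data pooled across all models."""
    pooled: List[Tuple[str, str]] = []
    for teacher in models:
        pooled.extend((s, teacher.translate(s)) for s in src_sentences)
    for student in models:
        # Each student sees hypotheses from differently biased peers, not
        # only its own outputs, unlike plain self-training.
        student.train_step(pooled)

if __name__ == "__main__":
    models = [ToyNMTModel("model-A"), ToyNMTModel("model-B")]
    reciprocal_round(models, ["ein kleiner Test", "noch ein Satz"])

Under this reading, plain self-training is the one-model special case in which each model only ever sees its own hypotheses — exactly the setting where the reinforced mistake problem arises.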
