Automatically Extracting Challenge Sets for Non-local Phenomena in Neural Machine Translation

@article{Choshen2019AutomaticallyEC,
  title={Automatically Extracting Challenge Sets for Non-local Phenomena in Neural Machine Translation},
  author={Leshem Choshen and Omri Abend},
  journal={ArXiv},
  year={2019},
  volume={abs/1909.06814}
}
We show that the state-of-the-art Transformer MT model is not biased towards monotonic reordering (unlike earlier recurrent neural network models), but that long-distance dependencies nevertheless remain a challenge for it. Since most dependencies are short-distance, common evaluation metrics are little influenced by how well systems handle the long-distance ones. We therefore propose an automatic approach for extracting challenge sets replete with long-distance dependencies, and argue that…
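The extraction procedure itself is not spelled out in the truncated abstract above, but the core idea, automatically finding sentences rich in long-distance dependencies, can be sketched. The following is a minimal illustration assuming a spaCy dependency parser and a hypothetical distance threshold MIN_DISTANCE; it is not the authors' actual method:

# Sketch: filter a corpus for sentences containing long-distance dependencies.
# Assumptions (not from the paper): spaCy's English parser as the dependency
# source, and token distance between a word and its syntactic head as the
# measure of dependency length. MIN_DISTANCE is a hypothetical cutoff.
import spacy

nlp = spacy.load("en_core_web_sm")
MIN_DISTANCE = 8  # hypothetical threshold, in tokens

def max_dependency_distance(sentence: str) -> int:
    """Largest token distance between any word and its syntactic head."""
    doc = nlp(sentence)
    return max((abs(tok.i - tok.head.i) for tok in doc), default=0)

def extract_challenge_set(sentences):
    """Keep sentences whose parse has at least one long-distance dependency."""
    return [s for s in sentences if max_dependency_distance(s) >= MIN_DISTANCE]

if __name__ == "__main__":
    corpus = [
        "The cat sat on the mat.",
        "The keys that the man who lives across the street lost were found.",
    ]
    print(extract_challenge_set(corpus))  # only the second sentence survives

Maximum head-to-dependent token distance is just one simple proxy for non-locality; a set extracted this way would, as the abstract argues, stress exactly the dependencies that aggregate metrics on ordinary test sets barely register.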
