Why Neural Translations are the Right Length

@inproceedings{Shi2016WhyNT,
  title={Why Neural Translations are the Right Length},
  author={Xing Shi and Kevin Knight and Deniz Yuret},
  booktitle={EMNLP},
  year={2016}
}
We investigate how neural, encoder-decoder translation systems output target strings of appropriate lengths, finding that a collection of hidden units learns to explicitly implement this functionality. 
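The paper's central finding is that certain LSTM cells act as counters: they accumulate during encoding and drain during decoding, so the decoder stops when the count returns toward zero. As a hedged illustration only (a toy sketch, not the paper's actual trained model or probing method), such a "length unit" can be caricatured like this:

```python
# Toy illustration (hypothetical, not the paper's model): a single
# "length unit" whose cell state increments once per source token
# during encoding and decrements once per target token during
# decoding; the decoder emits the end symbol when it hits zero.
def translate_length(source_len):
    c = 0.0
    for _ in range(source_len):   # encoding: +1 per source token
        c += 1.0
    out_len = 0
    while c > 0:                  # decoding: -1 per target token
        c -= 1.0
        out_len += 1
    return out_len

# Such a counter makes output length track input length exactly.
for n in [3, 7, 12]:
    assert translate_length(n) == n
```

In the real model the increment is learned and soft rather than exactly 1.0, and the paper identifies the responsible units by inspecting hidden-state trajectories over time.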

Citations

Publications citing this paper.

Learning to Decode for Future Success


Analysis Methods in Neural Language Processing: A Survey

Transactions of the Association for Computational Linguistics, 2018
