Why Neural Translations are the Right Length

@inproceedings{Shi2016WhyNT,
  title={Why Neural Translations are the Right Length},
  author={Xing Shi and Kevin Knight and Deniz Yuret},
  booktitle={EMNLP},
  year={2016}
}
We investigate how neural encoder-decoder translation systems output target strings of appropriate lengths, finding that a collection of hidden units learns to explicitly implement this functionality.
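The mechanism the abstract describes can be illustrated with a deliberately simplified sketch (not the paper's actual model): imagine a single hidden "length unit" that counts up once per source token during encoding and counts down once per emitted target token during decoding, triggering end-of-sentence when it returns to zero. The function name and counter are hypothetical, for illustration only.

```python
# Toy sketch of a length-tracking hidden unit (hypothetical illustration,
# not the trained LSTM analyzed in the paper).

def decoded_length(source_tokens):
    """Return how many target tokens a counter-based decoder would emit."""
    length_unit = 0.0
    for _ in source_tokens:       # encoding: the unit counts up per token
        length_unit += 1.0
    output = []
    while length_unit > 0.0:      # decoding: counts down, then stops (EOS)
        output.append("<tok>")
        length_unit -= 1.0
    return len(output)
```

With this counter, a five-token source yields a five-token output; the trained networks studied in the paper realize a softer, distributed version of such counting across several hidden units.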

