Still not there? Comparing Traditional Sequence-to-Sequence Models to Encoder-Decoder Neural Networks on Monotone String Translation Tasks

@inproceedings{Schnober2016StillNT,
  title={Still not there? Comparing Traditional Sequence-to-Sequence Models to Encoder-Decoder Neural Networks on Monotone String Translation Tasks},
  author={Carsten Schnober and Steffen Eger and Erik-L{\^a}n Do Dinh and Iryna Gurevych},
  booktitle={COLING},
  year={2016}
}
We analyze the performance of encoder-decoder neural models and compare them with well-known, established methods. The latter represent different classes of traditional approaches that are applied to the monotone sequence-to-sequence tasks OCR post-correction, spelling correction, grapheme-to-phoneme conversion, and lemmatization. Such tasks are of practical relevance for various higher-level research fields including digital humanities, automatic text correction, and speech recognition. We…
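
To make the task setting concrete: a monotone string-translation task maps an input character sequence to an output character sequence with little or no reordering (e.g. lemmatization: "running" -> "run"). The sketch below is a minimal character-level encoder-decoder in PyTorch, assuming a toy lowercase vocabulary and an untrained model; it is not the architecture evaluated in the paper, only an illustration of how such tasks are framed for encoder-decoder networks.

```python
# Minimal character-level encoder-decoder (seq2seq) sketch in PyTorch.
# NOT the paper's model; vocabulary, sizes, and decoding are placeholder assumptions.
import torch
import torch.nn as nn

PAD, SOS, EOS = 0, 1, 2
chars = "abcdefghijklmnopqrstuvwxyz"
stoi = {c: i + 3 for i, c in enumerate(chars)}   # char -> index
itos = {i: c for c, i in stoi.items()}
VOCAB = len(stoi) + 3

class Encoder(nn.Module):
    def __init__(self, emb=32, hid=64):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, emb, padding_idx=PAD)
        self.rnn = nn.GRU(emb, hid, batch_first=True)

    def forward(self, x):                 # x: (batch, src_len)
        _, h = self.rnn(self.emb(x))      # h: (1, batch, hid) summarizes the input string
        return h

class Decoder(nn.Module):
    def __init__(self, emb=32, hid=64):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, emb, padding_idx=PAD)
        self.rnn = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, VOCAB)

    def forward(self, y_prev, h):         # one decoding step
        o, h = self.rnn(self.emb(y_prev), h)
        return self.out(o), h

def greedy_decode(enc, dec, word, max_len=30):
    """Encode an input string and greedily decode an output string."""
    x = torch.tensor([[stoi[c] for c in word]])
    h = enc(x)
    y = torch.tensor([[SOS]])
    out = []
    for _ in range(max_len):
        logits, h = dec(y, h)
        y = logits[:, -1].argmax(-1, keepdim=True)
        if y.item() == EOS:
            break
        out.append(itos.get(y.item(), "?"))
    return "".join(out)

if __name__ == "__main__":
    enc, dec = Encoder(), Decoder()
    # Untrained, so the output is arbitrary; after training on (inflected form,
    # lemma) pairs the same call would be expected to yield e.g. "run".
    print(greedy_decode(enc, dec, "running"))
```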