Forward-backward retraining of recurrent neural networks

@inproceedings{Senior1995ForwardbackwardRO,
  title={Forward-backward retraining of recurrent neural networks},
  author={Andrew W. Senior and Anthony J. Robinson},
  booktitle={NIPS},
  year={1995}
}
This paper describes the training of a recurrent neural network as the letter posterior probability estimator for a hidden Markov model based off-line handwriting recognition system. The network estimates posterior distributions for each of a series of frames representing sections of a handwritten word. The supervised training algorithm, backpropagation through time, requires target outputs to be provided for each frame. Three methods for deriving these targets are presented. A novel method based…
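The truncated final sentence points at the paper's central idea: deriving the per-frame targets from the forward-backward algorithm of HMM training rather than from a fixed segmentation. The sketch below is not the authors' formulation, only a minimal illustration of the general technique, assuming a left-to-right model with one state per letter of the word, scaled likelihoods obtained by dividing the network's letter posteriors by letter priors (the usual hybrid NN/HMM conversion), and a fixed self-loop probability. The function name forward_backward_targets, the parameter p_stay, and the use of Python/NumPy are illustrative choices, not taken from the paper.

# Hedged sketch: per-frame soft targets via the forward-backward algorithm.
# Assumptions (not from the paper's text): one left-to-right state per letter
# of the word, scaled likelihoods = posterior / prior, fixed self-loop
# probability, and at least as many frames as letters.
import numpy as np


def forward_backward_targets(posteriors, priors, letter_states, p_stay=0.6):
    """Return per-frame soft targets gamma[t, s] = P(state s | all frames).

    posteriors    : (T, L) network outputs, P(letter | frame)
    priors        : (L,)   letter prior probabilities
    letter_states : letter indices spelling the word, e.g. [2, 0, 19]
    p_stay        : self-loop probability of each left-to-right state
    """
    T = posteriors.shape[0]
    S = len(letter_states)

    # Scaled likelihoods p(frame | letter) ~ posterior / prior.
    lik = posteriors[:, letter_states] / priors[letter_states]

    # Left-to-right transitions: stay in a letter or advance to the next one.
    A = np.zeros((S, S))
    for s in range(S):
        A[s, s] = p_stay
        if s + 1 < S:
            A[s, s + 1] = 1.0 - p_stay
    A[S - 1, S - 1] = 1.0  # final letter state absorbs

    # Forward pass with per-frame renormalisation to avoid underflow.
    alpha = np.zeros((T, S))
    alpha[0, 0] = lik[0, 0]                 # the word starts in its first letter
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        alpha[t] /= alpha[t].sum()

    # Backward pass, constrained to finish in the last letter at the last frame.
    beta = np.zeros((T, S))
    beta[T - 1, S - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (lik[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()

    # State occupation probabilities: soft targets for retraining the network.
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma


# Example: 12 frames, a 4-letter alphabet, and a word spelled by indices [2, 0, 3].
rng = np.random.default_rng(0)
post = rng.dirichlet(np.ones(4), size=12)       # stand-in for network outputs
gamma = forward_backward_targets(post, np.full(4, 0.25), [2, 0, 3])

Summing gamma over any states that share the same letter gives a per-frame distribution over the letter alphabet, which can then be fed back as the target vectors for backpropagation through time, in contrast to targets taken from a single fixed alignment.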

