One step Backpropagation Through Time for learning input mapping in reservoir computing applied to speech recognition

@inproceedings{Hermans2010OneSB,
  title={One step Backpropagation Through Time for learning input mapping in reservoir computing applied to speech recognition},
  author={Michiel Hermans and Benjamin Schrauwen},
  booktitle={Proceedings of 2010 IEEE International Symposium on Circuits and Systems},
  year={2010},
  pages={521--524}
}
Recurrent neural networks are powerful engines for processing information that is coded in time; however, many problems with common training algorithms, such as Backpropagation Through Time, remain. Because of this, another important learning setup known as Reservoir Computing has appeared in recent years, in which one uses an essentially untrained network to perform computations. Though very successful in many applications, using a random network can be quite inefficient when considering the…
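The Reservoir Computing setup the abstract contrasts with full BPTT can be illustrated with a minimal echo state network sketch: a fixed random recurrent network whose only trained part is a linear readout. All names, sizes, and the one-step-delay task below are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed (untrained) random input weights and reservoir weights.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
# Rescale to spectral radius 0.9 so the reservoir has fading memory.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with a scalar input sequence; collect states."""
    x = np.zeros(n_res)
    X = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in[:, 0] * ut)
        X[t] = x
    return X

# Toy memory task: reproduce the input delayed by one step.
u = rng.uniform(-1.0, 1.0, 500)
y = np.roll(u, 1)
y[0] = 0.0

X = run_reservoir(u)

# Only the linear readout is trained, here by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out

mse = np.mean((pred[10:] - y[10:]) ** 2)  # skip the initial transient
```

Training only `W_out` is a cheap least-squares problem, which is the efficiency argument for Reservoir Computing; the paper's contribution is to additionally adapt the input mapping with a one-step BPTT gradient rather than leaving it random.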
