A Learning Algorithm for Continually Running Fully Recurrent Neural Networks

@article{Williams1989ALA,
  title={A Learning Algorithm for Continually Running Fully Recurrent Neural Networks},
  author={Ronald J. Williams and David Zipser},
  journal={Neural Computation},
  year={1989},
  volume={1},
  pages={270--280}
}
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks. These algorithms have (1) the advantage that they do not require a precisely defined training interval, operating while the network runs; and (2) the disadvantage that they require nonlocal communication in the network being trained and are computationally expensive. These…
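The trade-off the abstract describes comes from the real-time recurrent learning (RTRL) recursion derived in the paper: each weight update is read off a running table of sensitivities p^k_ij = dy_k/dw_ij, so learning can proceed while the network runs, but maintaining that table couples every unit to every weight. The NumPy sketch below is a minimal illustration of that recursion, not the paper's code; the tanh nonlinearity, network sizes, learning rate, bias line, and the toy delayed-echo task are all assumed for the example.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_units = 2, 4                  # illustrative sizes
n_z = n_in + 1 + n_units              # z(t) = inputs, a constant bias line, unit outputs
lr = 0.05

W = rng.normal(scale=0.1, size=(n_units, n_z))  # every unit sees every input and every unit
y = np.zeros(n_units)                           # unit outputs y(t)
P = np.zeros((n_units, n_units, n_z))           # P[k, i, j] = dy_k / dw_ij

def step(x, d=None):
    """Advance the network one tick; if targets d are given, adapt W while it runs."""
    global W, y, P
    z = np.concatenate([x, [1.0], y])
    if d is not None:
        e = d - y                                   # error on the current outputs
        W += lr * np.einsum('k,kij->ij', e, P)      # dw_ij = lr * sum_k e_k p^k_ij
    s = W @ z
    y_next = np.tanh(s)
    fp = 1.0 - y_next ** 2                          # f'(s_k) for tanh units
    # Sensitivity recursion:
    # p^k_ij(t+1) = f'(s_k) * [ sum_l w_kl p^l_ij(t) + delta_ki z_j(t) ]
    W_rec = W[:, n_in + 1:]                         # columns multiplying recurrent activity
    P = fp[:, None, None] * (np.einsum('kl,lij->kij', W_rec, P)
                             + np.eye(n_units)[:, :, None] * z[None, None, :])
    y = y_next
    return y

# Hypothetical toy task: unit 0 learns to echo input line 0 one step later.
xs = rng.integers(0, 2, size=(2000, n_in)).astype(float)
errs = []
for t in range(1, len(xs)):
    d = y.copy()                          # zero error on units without targets
    d[0] = 2.0 * xs[t - 1, 0] - 1.0       # map {0,1} into tanh's range
    errs.append(abs(d[0] - y[0]))
    step(xs[t], d)
print(f"mean |error|, steps 1-100: {np.mean(errs[:100]):.3f}, last 100: {np.mean(errs[-100:]):.3f}")

The einsum over all units in the sensitivity update is the nonlocal communication the abstract flags, and the (n_units, n_units, n_z)-shaped tensor P is the cost: storage grows as O(n^3) and per-step work as roughly O(n^4), which is the "computationally expensive" part.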

Citations

Publications citing this paper.
1,398 citations in total (estimated 25% coverage), spanning 1989–2019.

CITATION STATISTICS

  • 197 highly influenced citations

  • Averaged 117 citations per year over the last 3 years

  • 30% increase in citations per year in 2018 over 2017

References

Publications referenced by this paper.
SHOWING 4 OF 16 REFERENCES

Learning to represent state

  • J.
  • Unpublished master’s thesis, University of…
  • 1988
Highly Influential
3 Excerpts

A learning algorithm for continually running fully recurrent neural networks

  • R. J. Williams, D. Zipser
  • ICS Technical Report 8805. La Jolla: University…
  • 1988

Experiments with sequential associative memories

  • S. I., D. King
  • 1988

Experiments with sequential associative memories

  • J. J. Hopfield
  • 1988
