A Fixed Size Storage O(n³) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks

@article{Schmidhuber1992AFS,
  title={A Fixed Size Storage O(n^3) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks},
  author={J. Schmidhuber},
  journal={Neural Computation},
  year={1992},
  volume={4},
  pages={243--248}
}
The real-time recurrent learning (RTRL) algorithm (Robinson and Fallside 1987; Williams and Zipser 1989) requires O(n⁴) computations per time step, where n is the number of non-input units. I describe a method suited for on-line learning that computes exactly the same gradient and requires fixed-size storage of the same order, but has an average time complexity per time step of O(n³).
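For context, the sensitivity update below is what makes plain RTRL expensive. This is a minimal NumPy sketch of one standard RTRL step for a fully recurrent tanh network, not Schmidhuber's block algorithm itself; the function name rtrl_step and the omission of external inputs are simplifications for illustration. The sensitivity tensor P is the O(n³) fixed-size storage both methods share, and the einsum line is the O(n⁴) term whose cost the paper's method reduces.

import numpy as np

def rtrl_step(W, y, P):
    # One step of plain RTRL for a fully recurrent net y' = tanh(W @ y)
    # (external inputs omitted for brevity).
    #   W : (n, n) recurrent weight matrix
    #   y : (n,)   current unit activations
    #   P : (n, n, n) sensitivities, P[k, i, j] = dy_k / dW_ij
    # Storage is O(n^3); the einsum below is the O(n^4) bottleneck.
    s = W @ y                        # net inputs, O(n^2)
    y_new = np.tanh(s)
    fprime = 1.0 - y_new ** 2        # tanh'(s)

    # Direct term: ds_k/dW_ij = delta_{ki} * y_j
    direct = np.zeros_like(P)
    k = np.arange(W.shape[0])
    direct[k, k, :] = y

    # Recurrent term: sum_l W[k, l] * P[l, i, j]  -- O(n^4) work
    recurrent = np.einsum('kl,lij->kij', W, P)

    P_new = fprime[:, None, None] * (direct + recurrent)
    return y_new, P_new

# Example: run a few steps, then read off an error gradient.
n = 5
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(n, n))
y, P = np.zeros(n), np.zeros((n, n, n))
for _ in range(10):
    y, P = rtrl_step(W, y, P)
e = rng.normal(size=n)               # stand-in error signal e_k at this step
grad = np.einsum('k,kij->ij', e, P)  # dE/dW_ij = sum_k e_k * P[k, i, j]

The paper's method computes exactly this gradient but, roughly speaking, spreads the expensive recurrent-term bookkeeping over blocks of consecutive time steps, so the cost per step drops to O(n³) on average while storage stays of the same order.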
131 Citations
TRTRL: A Localized Resource-Efficient Learning Algorithm for Recurrent Neural Networks
  • D. Budik, I. Elhanany
  • Computer Science
  • 2006 49th IEEE International Midwest Symposium on Circuits and Systems
  • 2006
Fast and Scalable Recurrent Neural Network Learning based on Stochastic Meta-Descent
Learning to Control Fast-Weight Memories: An Alternative to Dynamic Recurrent Networks
A Fast and Scalable Recurrent Neural Network Based on Stochastic Meta Descent
Locally Connected Recurrent
Backpropagation-decorrelation: online recurrent learning with O(N) complexity
  • J. Steil
  • Computer Science
  • 2004 IEEE International Joint Conference on Neural Networks
  • 2004
An analog VLSI recurrent neural network learning a continuous-time trajectory

References

Showing 1–10 of 12 references
A Subgrouping Strategy that Reduces Complexity and Speeds Up Learning in Recurrent Networks
  • D. Zipser
  • Computer Science
  • Neural Computation
  • 1989
Experimental Analysis of the Real-time Recurrent Learning Algorithm
A learning algorithm for analog, fully recurrent neural networks
  • M. Gherrity
  • Computer Science
  • International 1989 Joint Conference on Neural Networks
  • 1989
Learning State Space Trajectories in Recurrent Neural Networks
Time Dependent Adaptive Neural Networks
Adaptive decomposition of time
Complexity of exact gradient computation algorithms for recurrent neural networks
  • R. J. Williams
  • Tech. Rep. NU-CCS-89-27, Boston: Northeastern University, College of Computer Science
  • 1989
An efficient gradient-based algorithm for on-line training of recurrent network trajectories
  • R. J. Williams, J. Peng
  • Neural Computation
  • 1990