Timescale Separation in Recurrent Neural Networks


Supervised learning in recurrent neural networks involves two coupled processes: the neural activity from which gradients are estimated, and the updates to connection parameters induced by those gradient estimates. A problem such algorithms must address is how to balance the relative rates of these two processes, so that accurate sensitivity estimates are obtained while still…
DOI: 10.1162/NECO_a_00740
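The two-timescale structure described in the abstract can be illustrated with a minimal sketch: a fast inner loop evolves the network activity and accumulates a sensitivity estimate under (approximately) frozen weights, while a slow outer loop applies one small parameter update per block of fast steps. All specifics below (the leaky-tanh dynamics, the zero target, the outer-product gradient estimate, and every constant) are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10                   # number of units (arbitrary choice)
dt = 0.01                # fast timescale: activity dynamics
eta = 1e-3               # slow timescale: parameter updates
steps_per_update = 100   # degree of separation between the two timescales

W = 0.1 * rng.standard_normal((n, n))   # recurrent weights
x = rng.standard_normal(n)              # unit activities
target = np.zeros(n)                    # hypothetical fixed-point target

for update in range(50):
    # Fast process: let activity evolve with the weights held fixed,
    # accumulating a crude sensitivity (gradient) estimate along the way.
    grad = np.zeros_like(W)
    for _ in range(steps_per_update):
        x = x + dt * (-x + np.tanh(W @ x))   # leaky RNN dynamics
        err = x - target
        grad += np.outer(err, x)             # instantaneous error-activity correlation
    # Slow process: one small weight update from the time-averaged estimate.
    W -= eta * grad / steps_per_update
```

Raising `steps_per_update` widens the timescale separation: the gradient is averaged over more of the activity trajectory before the parameters move, trading update frequency for estimate accuracy, which is the balance the abstract refers to.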