Effects of Noise on Convergence and Generalization in Recurrent Networks

@inproceedings{Jim1994EffectsON,
  title={Effects of Noise on Convergence and Generalization in Recurrent Networks},
  author={Kam-Chuen Jim and Bill G. Horne and C. Lee Giles},
  booktitle={NIPS},
  year={1994}
}
We introduce and study methods of inserting synaptic noise into dynamically-driven recurrent neural networks and show that applying a controlled amount of noise during training may improve convergence and generalization. In addition, we analyze the effects of each noise parameter (additive vs. multiplicative, cumulative vs. non-cumulative, per time step vs. per string) and predict that best overall performance can be achieved by injecting additive noise at each time step. Extensive simulations…
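
The three noise parameters named in the abstract combine into a small family of injection schemes. Below is a minimal NumPy sketch of how synaptic noise with those switches might be applied to the recurrent weight matrix of a toy RNN during a forward pass; the function names, the zero-mean Gaussian noise model, the noise scale, and the tanh cell are illustrative assumptions, not the paper's implementation, and the weight-update step of training is omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    def perturb_weights(W, noise_state, *, additive=True, cumulative=False,
                        per_step=True, scale=0.01, step=0):
        # Illustrative sketch, not the paper's code.
        # additive:   use W + n; otherwise multiplicative, W * (1 + n)
        # cumulative: fresh draws accumulate onto the running noise state
        #             instead of replacing it
        # per_step:   redraw noise at every time step; if False, draw once
        #             at step 0 and reuse that noise for the whole string
        if per_step or step == 0:
            fresh = rng.normal(0.0, scale, size=W.shape)
            noise_state = noise_state + fresh if cumulative else fresh
        W_noisy = W + noise_state if additive else W * (1.0 + noise_state)
        return W_noisy, noise_state

    def noisy_rnn_forward(x_seq, W_in, W_rec, **noise_opts):
        # Run a simple tanh RNN over one input string, perturbing the
        # recurrent weights at each step according to noise_opts.
        h = np.zeros(W_rec.shape[0])
        noise_state = np.zeros_like(W_rec)
        for t, x in enumerate(x_seq):
            W_t, noise_state = perturb_weights(W_rec, noise_state, step=t,
                                               **noise_opts)
            h = np.tanh(W_in @ x + W_t @ h)
        return h

    # Toy usage: a 5-symbol binary string, 4 hidden units, and the setting
    # the abstract predicts works best (additive noise at each time step).
    W_in = rng.normal(0.0, 0.5, size=(4, 1))
    W_rec = rng.normal(0.0, 0.5, size=(4, 4))
    string = [np.array([b], dtype=float) for b in (1, 0, 1, 1, 0)]
    h_final = noisy_rnn_forward(string, W_in, W_rec,
                                additive=True, cumulative=False, per_step=True)

Setting per_step=False gives the per-string variants, and cumulative=True lets perturbations build up over the course of a string rather than being redrawn independently.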
