Dynamic learning rate optimization of the backpropagation algorithm

  title={Dynamic learning rate optimization of the backpropagation algorithm},
  author={Xiao-Hu Yu and Guo-An Chen and Shixin Cheng},
  journal={IEEE Transactions on Neural Networks},
  volume={6},
  number={3},
It has been observed by many authors that backpropagation (BP) error surfaces usually consist of a large number of flat regions as well as extremely steep regions. As such, the BP algorithm with a fixed learning rate has low efficiency. This paper considers dynamic learning rate optimization of the BP algorithm using derivative information. An efficient method of deriving the first and second derivatives of the objective function with respect to the learning rate is explored, which…
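The idea the abstract describes — choosing the learning rate each iteration from the first and second derivatives of the objective with respect to the rate — can be sketched on a toy quadratic objective. This is a minimal illustration under an assumed quadratic loss, with function names and setup of our own choosing, not the paper's exact derivation:

```python
import numpy as np

# Hedged sketch (our illustration, not the paper's method verbatim): for a
# quadratic objective E(w) = 0.5 * w^T A w, a gradient step w <- w - eta * g
# defines a 1-D function E(eta) = E(w - eta * g) whose derivatives are
#   E'(0)  = -g^T g,     E''(0) = g^T A g,
# so a single Newton step on eta yields the per-iteration optimal rate
#   eta* = (g^T g) / (g^T A g).

def dynamic_rate_step(w, A):
    """One gradient step whose learning rate is set by a Newton step on eta."""
    g = A @ w                      # gradient of 0.5 * w^T A w
    eta = (g @ g) / (g @ A @ g)    # optimal rate from E'(eta*) = 0
    return w - eta * g

A = np.diag([100.0, 1.0])          # ill-conditioned: one steep, one flat direction
w = np.array([1.0, 1.0])
E = lambda v: 0.5 * v @ A @ v

for _ in range(10):
    w = dynamic_rate_step(w, A)
# the dynamically tuned rate drives E(w) near zero in a handful of steps;
# any single fixed eta would either diverge along the steep direction
# or crawl along the flat one
```

The ill-conditioned diagonal matrix mimics the mix of steep and flat regions the abstract mentions: a fixed rate small enough to be stable in the steep direction is far too slow in the flat one, which is exactly why a per-iteration rate helps.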


Publications citing this paper (3 of 78 extracted citations shown):

A new dynamic optimal learning rate for a two-layer neural network

2012 International Conference on System Science and Engineering (ICSSE) • 2012

Uncertainty quantification methods for neural networks pattern recognition

2017 IEEE Symposium Series on Computational Intelligence (SSCI) • 2017

A new fast-F-CONFIS training of fully-connected neuro-fuzzy inference system

2015 International Conference on Informative and Cybernetics for Computational Social Systems (ICCSS) • 2015
