Convergence Rate of Stochastic Gradient Search in the Case of Multiple and Non-Isolated Minima

@inproceedings{Tadic2009ConvergenceRO,
  title={Convergence Rate of Stochastic Gradient Search in the Case of Multiple and Non-Isolated Minima},
  author={Vladislav B. Tadic},
  year={2009}
}
The convergence rate of stochastic gradient search is analyzed in this paper. Using arguments based on differential geometry and Łojasiewicz inequalities, tight bounds on the convergence rate of general stochastic gradient algorithms are derived. In contrast to existing results, the results presented in this paper allow the objective function to have multiple, non-isolated minima, impose no restriction on the values of the Hessian of the objective function, and do not require the algorithm…
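
A minimal sketch of the setting the abstract refers to (the notation below is illustrative and not taken from the paper): the algorithm in question is the standard stochastic gradient recursion

  \[ \theta_{n+1} = \theta_n - \gamma_n \bigl( \nabla f(\theta_n) + \xi_n \bigr), \qquad n \ge 0, \]

where f is the objective function, \{\gamma_n\} are step sizes and \{\xi_n\} is gradient-estimation noise. The Łojasiewicz gradient inequality on which such rate analyses typically rely states that, for a real-analytic f and any critical point a, there exist constants c, \delta > 0 and an exponent \beta \in [1/2, 1) such that

  \[ \|\nabla f(\theta)\| \ge c\, |f(\theta) - f(a)|^{\beta} \quad \text{whenever } \|\theta - a\| \le \delta, \]

which bounds how quickly f(\theta_n) can approach its limit even when the set of minima is non-isolated.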

