• Corpus ID: 11535680

# Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition

@article{Schmidt2013FastCO,
  title={Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition},
  author={Mark W. Schmidt and Nicolas Le Roux},
  journal={arXiv: Optimization and Control},
  year={2013}
}
• Published 28 August 2013
• Mathematics, Computer Science
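For context, the strong growth condition in the title is usually stated (notation mine, in one common form; not quoted from this page) as a bound relating each component gradient to the full gradient. It forces every stationary point of the sum to be stationary for each component, which is what allows stochastic gradient descent with a constant stepsize to match deterministic convergence rates:

```latex
% Strong growth condition for f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x):
% each component gradient is bounded by the full gradient (no additive noise term).
\|\nabla f_i(x)\| \le B \, \|\nabla f(x)\| \qquad \text{for all } i \text{ and all } x.
```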

## Citations

### A globally convergent incremental Newton method

• Computer Science, Mathematics
Math. Program.
• 2015
It is shown that the incremental Newton method for minimizing the sum of a large number of strongly convex functions is globally convergent under a variable stepsize rule, and that under a gradient growth condition the convergence rate is linear for both variable and constant stepsize rules.

## References


### Incremental Gradient Algorithms with Stepsizes Bounded Away from Zero

• M. Solodov
• Computer Science
Comput. Optim. Appl.
• 1998
The first convergence results of any kind are derived for this computationally important case; it is shown that a certain ε-approximate solution can be obtained, and the linear dependence of ε on the stepsize limit is established.
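To make the ε-result concrete, here is a toy numerical sketch (my own illustration, not Solodov's construction): cyclic incremental gradient descent with a constant stepsize settles into a limit cycle near the true minimizer, and shrinking the stepsize shrinks the residual error roughly proportionally.

```python
def incremental_gradient(targets, stepsize, n_epochs=2000):
    """Cyclic incremental gradient descent on f(x) = sum_i 0.5 * (x - a_i)^2.

    With a stepsize bounded away from zero the iterates reach only an
    epsilon-approximate solution; epsilon scales linearly with the stepsize.
    """
    x = 0.0
    for _ in range(n_epochs):
        for a in targets:
            x -= stepsize * (x - a)  # gradient of the component 0.5 * (x - a)^2
    return x

targets = [1.0, 2.0, 3.0, 6.0]
x_star = sum(targets) / len(targets)  # exact minimizer of the full sum
err_big = abs(incremental_gradient(targets, 0.10) - x_star)
err_small = abs(incremental_gradient(targets, 0.01) - x_star)
# Reducing the stepsize tenfold reduces the residual error by roughly a factor of ten.
```

The demo isolates exactly the phenomenon the abstract describes: the error floor is an artifact of the constant stepsize, not of the problem, and it vanishes linearly as the stepsize limit goes to zero.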

### An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule

• P. Tseng
• Computer Science
SIAM J. Optim.
• 1998
We consider an incremental gradient method with momentum term for minimizing the sum of continuously differentiable functions. This method uses a new adaptive stepsize rule that decreases the …
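As an illustration only (not Tseng's actual algorithm, whose adaptive stepsize rule is truncated above), a minimal sketch of a cyclic incremental gradient method with a heavy-ball momentum term, using a fixed stepsize on a toy quadratic sum:

```python
def incremental_gradient_momentum(grads, x0, n_epochs=500, stepsize=0.01, beta=0.5):
    """Cyclic incremental gradient method with a momentum (heavy-ball) term:

        x_{k+1} = x_k - stepsize * grad_{i_k}(x_k) + beta * (x_k - x_{k-1}),

    where i_k cycles through the components of f(x) = sum_i f_i(x).
    Hypothetical sketch with a *fixed* stepsize; Tseng's method instead
    adapts the stepsize as the iterations proceed.
    """
    x_prev = x = float(x0)
    for _ in range(n_epochs):
        for g in grads:  # one cyclic pass over the component gradients
            x, x_prev = x - stepsize * g(x) + beta * (x - x_prev), x
    return x

# Toy problem: f(x) = sum_i 0.5 * (x - a_i)^2, with minimizer mean(a_i) = 3.0.
targets = [1.0, 2.0, 3.0, 6.0]
component_grads = [lambda x, a=a: x - a for a in targets]
x_final = incremental_gradient_momentum(component_grads, x0=0.0)
```

With these (assumed) parameters the iterate lands near the minimizer 3.0, up to the small bias induced by the constant stepsize; the momentum term `beta * (x - x_prev)` is what distinguishes this update from the plain incremental gradient step.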

### Robust Stochastic Approximation Approach to Stochastic Programming

• Computer Science, Mathematics
SIAM J. Optim.
• 2009
It is intended to demonstrate that a properly modified stochastic approximation (SA) approach can be competitive with, and even significantly outperform, the sample average approximation (SAA) method for a certain class of convex stochastic problems.

### Introductory Lectures on Convex Optimization - A Basic Course

In the middle of the 1980s, the seminal paper by Karmarkar opened a new epoch in nonlinear optimization, and it became more and more common for new methods to be provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.