A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets

@inproceedings{Roux2012ASG,
  title={A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets},
  author={Nicolas Le Roux and Mark W. Schmidt and Francis R. Bach},
  booktitle={NIPS},
  year={2012}
}
We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. In a machine learning context, numerical experiments indicate that the new algorithm can dramatically outperform standard algorithms, both in terms of optimizing the training error and reducing the test error quickly.
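
The core idea described in the abstract, now commonly known as the stochastic average gradient (SAG) method, is to keep the most recently computed gradient of each f_i in memory and to step along the average of all stored gradients. The sketch below illustrates this update for l2-regularized logistic regression; the function name, data layout, and constant step size are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def sag_logistic(X, y, lam, step_size, n_iters, seed=0):
    """Sketch of a stochastic average gradient (SAG) loop for
    l2-regularized logistic regression with labels y in {-1, +1}.

    Each f_i(w) = log(1 + exp(-y_i * x_i.w)) + (lam / 2) * ||w||^2,
    so the sum is strongly convex, matching the paper's assumption."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    grad_memory = np.zeros((n, d))  # last stored gradient of each f_i
    grad_sum = np.zeros(d)          # running sum of the stored gradients

    for _ in range(n_iters):
        i = rng.integers(n)  # sample one training example uniformly
        margin = y[i] * X[i].dot(w)
        # Gradient of f_i at the current iterate.
        g = -y[i] * X[i] / (1.0 + np.exp(margin)) + lam * w
        grad_sum += g - grad_memory[i]  # swap in the fresh gradient for f_i
        grad_memory[i] = g
        w -= (step_size / n) * grad_sum  # step along the average stored gradient
    return w

Storing one gradient per example costs O(n * d) memory in general; for linear models such as this one, only the scalar loss derivative at each margin needs to be kept per example, reducing the memory overhead to O(n).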

Citations

This paper has been cited by 328 publications (estimated 31% coverage).

CITATION STATISTICS

  • 65 highly influenced citations

  • An average of 78 citations per year over the last three years

  • A 20% increase in citations in 2018 compared with 2017
