Corpus ID: 30924997

Stochastic Recursive Gradient Algorithm for Nonconvex Optimization

@article{Nguyen2017StochasticRG,
  title={Stochastic Recursive Gradient Algorithm for Nonconvex Optimization},
  author={Lam M. Nguyen and J. Liu and K. Scheinberg and Martin Tak{\'a}{\v c}},
  journal={ArXiv},
  year={2017},
  volume={abs/1705.07261}
}
In this paper, we study and analyze the mini-batch version of the StochAstic Recursive grAdient algoritHm (SARAH), a method employing the stochastic recursive gradient, for solving empirical loss minimization in the case of nonconvex losses. We provide a sublinear convergence rate (to stationary points) for general nonconvex functions and a linear convergence rate for gradient-dominated functions, both of which have some advantages compared to other modern stochastic gradient algorithms for…
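The recursive gradient estimator at the core of SARAH follows the update v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1}, refreshed with a full gradient at the start of each outer loop. Below is a minimal NumPy sketch of that recursion on a toy least-squares finite sum; the function names, step size, and iteration counts are illustrative assumptions, not parameters from the paper (which analyzes the mini-batch variant on nonconvex losses):

```python
import numpy as np

def sarah(grad_i, w0, n, step=0.1, outer_iters=20, inner_iters=None, seed=0):
    """Minimal SARAH sketch: each outer loop starts from a full gradient,
    then the inner loop applies the recursive stochastic update."""
    rng = np.random.default_rng(seed)
    m = inner_iters if inner_iters is not None else n
    w = np.asarray(w0, dtype=float)
    for _ in range(outer_iters):
        # Full gradient at the start of the outer loop.
        v = np.mean([grad_i(w, i) for i in range(n)], axis=0)
        w_prev, w = w, w - step * v
        for _ in range(m):
            i = rng.integers(n)
            # SARAH recursion: v_t = grad f_i(w_t) - grad f_i(w_{t-1}) + v_{t-1}
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev, w = w, w - step * v
    return w

# Toy finite sum: f(w) = (1/n) * sum_i 0.5*||w - a_i||^2, minimizer mean(a_i).
a = np.array([[1.0, 2.0], [3.0, 0.0], [2.0, 4.0], [0.0, 2.0]])
w_star = sarah(lambda w, i: w - a[i], np.zeros(2), n=len(a))
```

On this toy quadratic the recursive estimator happens to track the full gradient exactly, so the iterates converge to the mean of the data points; the paper's analysis covers the general nonconvex case where the estimator is only a controlled approximation.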
