VR-SGD: A Simple Stochastic Variance Reduction Method for Machine Learning

@article{Shang2020VRSGDAS,
  title={VR-SGD: A Simple Stochastic Variance Reduction Method for Machine Learning},
  author={F. Shang and Kaiwen Zhou and Hongying Liu and James Cheng and I. Tsang and L. Zhang and D. Tao and Licheng Jiao},
  journal={IEEE Transactions on Knowledge and Data Engineering},
  year={2020},
  volume={32},
  pages={188-202}
}
  • F. Shang, Kaiwen Zhou, Hongying Liu, James Cheng, I. Tsang, L. Zhang, D. Tao, and Licheng Jiao
  • Published 2020
  • Computer Science, Mathematics
  • IEEE Transactions on Knowledge and Data Engineering
  • In this paper, we propose a simple variant of the original SVRG, called variance reduced stochastic gradient descent (VR-SGD). [...] We also design two different update rules for smooth and non-smooth objective functions, respectively, which means that VR-SGD can tackle non-smooth and/or non-strongly convex problems directly without any reduction techniques. Moreover, we analyze the convergence properties of VR-SGD for strongly convex problems, which show that VR-SGD attains linear convergence…
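
  The abstract describes an SVRG-style variance-reduction scheme. The following is a minimal, hypothetical Python sketch of such a variance-reduced update loop, not the authors' reference implementation of VR-SGD: the function name, the least-squares example, the step size, and the choice of the last inner iterate as the next snapshot are illustrative assumptions.

    import numpy as np

    def vr_sgd_sketch(grad_full, grad_sample, w0, n, eta=0.02, epochs=10, m=None, seed=0):
        """SVRG-style variance-reduced SGD loop (illustrative sketch only).

        grad_full(w)      -> gradient of the full objective at w
        grad_sample(w, i) -> gradient of the i-th component function at w
        n                 -> number of component functions (samples)
        """
        rng = np.random.default_rng(seed)
        m = n if m is None else m                    # inner-loop length per epoch
        w_tilde = np.asarray(w0, dtype=float).copy() # snapshot point
        for _ in range(epochs):
            mu = grad_full(w_tilde)                  # full gradient at the snapshot
            w = w_tilde.copy()
            for _ in range(m):
                i = int(rng.integers(n))
                # Variance-reduced stochastic gradient: an unbiased estimate of the
                # full gradient whose variance shrinks as w and w_tilde approach the
                # optimum, which is what permits a constant step size eta.
                v = grad_sample(w, i) - grad_sample(w_tilde, i) + mu
                w = w - eta * v
            # Illustrative snapshot rule: reuse the last inner iterate (SVRG variants
            # differ in exactly how the snapshot and the next starting point are chosen).
            w_tilde = w
        return w_tilde

    # Tiny usage example on a least-squares problem f(w) = (1/2n) * ||A w - b||^2.
    A = np.random.default_rng(1).normal(size=(100, 5))
    b = A @ np.ones(5)
    gf = lambda w: A.T @ (A @ w - b) / len(b)
    gs = lambda w, i: A[i] * (A[i] @ w - b[i])
    w_est = vr_sgd_sketch(gf, gs, np.zeros(5), n=len(b), epochs=30)
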
    30 Citations

      • A Stochastic Variance Reduced Extragradient Method for Sparse Machine Learning Problems
      • Accelerated Variance Reduction Stochastic ADMM for Large-Scale Machine Learning
      • Guaranteed Sufficient Decrease for Stochastic Variance Reduced Gradient Optimization
      • ASVRG: Accelerated Proximal SVRG
      • Accelerating SGD using flexible variance reduction on large-scale datasets
      • Efficient Relaxed Gradient Support Pursuit for Sparsity Constrained Non-convex Optimization
      • SAAGs: Biased stochastic variance reduction methods for large-scale learning
      • SAAGs: Biased Stochastic Variance Reduction Methods
      • L-SVRG and L-Katyusha with Arbitrary Sampling
      • Stochastic Recursive Gradient Support Pursuit and Its Sparse Representation Applications
