Corpus ID: 1903720

Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization

@inproceedings{Frostig2015UnregularizingAP,
  title={Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization},
  author={Roy Frostig and Rong Ge and S. Kakade and Aaron Sidford},
  booktitle={ICML},
  year={2015}
}
We develop a family of accelerated stochastic algorithms that minimize sums of convex functions. Our algorithms improve upon the fastest running time for empirical risk minimization (ERM), and in particular linear least-squares regression, across a wide range of problem settings. To achieve this, we establish a framework based on the classical proximal point algorithm. Namely, we provide several algorithms that reduce the minimization of a strongly convex function to approximate minimizations…
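
The reduction described in the abstract can be made concrete with a minimal sketch (not the authors' implementation): each outer step of the approximate proximal point loop adds an explicit l2 term centered at the current iterate and solves the resulting regularized subproblem only approximately. Here a least-squares ERM objective is assumed, plain gradient steps stand in for the fast stochastic subproblem solvers the paper analyzes, and the function name approx_prox_point, the step counts, and the parameter lam are illustrative choices.

  import numpy as np

  def approx_prox_point(A, b, lam=1.0, outer_iters=20, inner_iters=100):
      # Objective: f(x) = (1/2n) ||Ax - b||^2.
      # Each outer step approximately solves the regularized subproblem
      #   x_{k+1} ~= argmin_x  f(x) + (lam/2) ||x - x_k||^2
      # using a few plain gradient steps as a stand-in for any fast
      # (stochastic) inner solver.
      n, d = A.shape
      x = np.zeros(d)
      # Step size from a crude smoothness bound for the regularized subproblem.
      step = 1.0 / (np.linalg.norm(A, 2) ** 2 / n + lam)
      for _ in range(outer_iters):
          center = x.copy()
          for _ in range(inner_iters):
              grad = A.T @ (A @ x - b) / n + lam * (x - center)
              x = x - step * grad
      return x

  # Example usage (synthetic data):
  # x_hat = approx_prox_point(np.random.randn(200, 10), np.random.randn(200))

The added proximal term makes each subproblem better conditioned than the original objective, which is why only approximate inner solves are needed; the paper's running-time bounds quantify this trade-off.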
119 Citations


Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
  • Yuchen Zhang, X. Lin
  • Computer Science, Mathematics
  • ICML, 2015
  • 199 citations
The Double-Accelerated Stochastic Method for Regularized Empirical Risk Minimization
  • L. Liu, D. Tao
  • Computer Science
  • IEEE Transactions on Emerging Topics in Computational Intelligence, 2019
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
  • 8 citations
  • Highly Influenced
Accelerated Doubly Stochastic Gradient Algorithm for Large-scale Empirical Risk Minimization
  • 3 citations
From Low Probability to High Confidence in Stochastic Convex Optimization
  • 7 citations
Accelerating Stochastic Gradient Descent for Least Squares Regression
  • 47 citations
Variance Reduced Stochastic Gradient Descent with Sufficient Decrease
  • 3 citations
  • Highly Influenced
