Corpus ID: 214803085

Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms

@article{Salim2020DualizeSR,
  title={Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms},
  author={A. Salim and Laurent Condat and Konstantin Mishchenko and Peter Richt{\'a}rik},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.02635}
}
We introduce new primal-dual algorithms to minimize the sum of three convex functions, each having its own oracle. Namely, the first one is differentiable, smooth and possibly stochastic, the second is proximable, and the last one is a composition of a proximable function with a linear map. By leveraging variance reduction, we prove convergence to an exact solution with sublinear or linear rates, depending on strong convexity properties. The proposed theory is simple and unified by the umbrella…
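The abstract describes a three-function template. As a rough sketch, with the symbols F, R, H and L chosen here for illustration (they are not taken verbatim from the paper), the problem reads

  \min_{x} \; F(x) + R(x) + H(Lx),

where F is convex and smooth with a (possibly stochastic) gradient oracle, R is convex and proximable, H is convex and proximable, and L is a linear map, so the last term is accessed only through the prox of H and products with L and L^T.

Below is a minimal Python sketch of the classical Condat–Vũ primal-dual iteration for this template, included only to show how the three oracles interact. It is a deterministic baseline from the literature, not the variance-reduced algorithms proposed in this paper; the step-size rule and the toy instance are assumptions made for the example.

import numpy as np

def condat_vu(grad_F, prox_R, prox_H, L, x0, y0, tau, sigma, n_iter=1000):
    """Condat-Vu primal-dual iteration for min_x F(x) + R(x) + H(L x).

    grad_F(x)    : gradient oracle of the smooth term F
    prox_R(v, t) : proximity operator of t*R at v
    prox_H(v, t) : proximity operator of t*H at v
    L            : linear map (here a NumPy matrix)
    Step sizes should satisfy tau * (beta/2 + sigma * ||L||^2) <= 1,
    where beta is the Lipschitz constant of grad_F.
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(n_iter):
        # primal step: forward (gradient) step on F, backward (prox) step on R
        x_new = prox_R(x - tau * (grad_F(x) + L.T @ y), tau)
        # dual step: prox of the conjugate H* via the Moreau identity
        # prox_{sigma H*}(u) = u - sigma * prox_{H/sigma}(u / sigma)
        u = y + sigma * (L @ (2 * x_new - x))
        y = u - sigma * prox_H(u / sigma, 1.0 / sigma)
        x = x_new
    return x

# Hypothetical toy instance: F(x) = 0.5*||A x - b||^2, R and H both the l1 norm.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
L = rng.standard_normal((5, 10))

grad_F = lambda x: A.T @ (A @ x - b)
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*||.||_1

beta = np.linalg.norm(A, 2) ** 2
norm_L = np.linalg.norm(L, 2)
sigma = 1.0 / norm_L ** 2
tau = 1.0 / (beta / 2 + sigma * norm_L ** 2)

x_star = condat_vu(grad_F, soft, soft, L, np.zeros(10), np.zeros(5), tau, sigma)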
Citations

  • Distributed Proximal Splitting Algorithms with Rates and Acceleration
  • An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints
  • Decentralized Dual Proximal Gradient Algorithms for Non-Smooth Constrained Composite Optimization Problems (Huaqing Li, Jinhui Hu, …, Tingwen Huang; IEEE Transactions on Parallel and Distributed Systems, 2021)
  • Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization
