Distributed Proximal Splitting Algorithms with Rates and Acceleration

@inproceedings{Condat2021DistributedPS,
  title={Distributed Proximal Splitting Algorithms with Rates and Acceleration},
  author={Laurent Condat and Grigory Malinovsky and Peter Richt{\'a}rik},
  booktitle={Frontiers in Signal Processing},
  year={2021}
}
We analyze several generic proximal splitting algorithms well suited for large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results with new rates on the function value suboptimality or distance to the solution, as well as new accelerated versions, using varying stepsizes. In addition, we propose distributed variants of these algorithms, which can be accelerated as well. While most existing results are ergodic, our nonergodic results significantly broaden our… 
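For orientation, the sketch below implements one classical member of this family: an accelerated forward-backward (FISTA-style) splitting with a varying inertial sequence, applied to the lasso problem. It is a generic illustration under assumed problem data (A, b, lam), not one of the algorithms analyzed in the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # proximity operator of t * ||.||_1 (soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=300):
    # Accelerated forward-backward splitting (FISTA) for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                       # forward (gradient) step
        x_new = soft_threshold(y - grad / L, lam / L)  # backward (proximal) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2       # varying inertial sequence
        y = x_new + ((t - 1) / t_new) * (x_new - x)    # extrapolation
        x, t = x_new, t_new
    return x

# usage sketch: x_hat = fista(np.random.randn(50, 100), np.random.randn(50), lam=0.1)
```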

Citations

Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists

TLDR
This overview of recent proximal splitting algorithms presents them within a unified framework, which consists in applying splitting methods for monotone inclusions in primal-dual product spaces, with well-chosen metrics, and emphasizes that when the smooth term in the objective function is quadratic, convergence is guaranteed with larger values of the relaxation parameter than previously known.

Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms

TLDR
New primal–dual algorithms to minimize the sum of three convex functions, each having its own oracle, are introduced and proved to converge to an exact solution with sublinear or linear rates, depending on strong convexity properties.

RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates

TLDR
A new primal–dual algorithm is proposed, in which the dual update is randomized; equivalently, the proximity operator of one of the functions in the problem is replaced by a stochastic oracle.

Dualize, Split, Randomize: Toward Fast Nonsmooth Optimization Algorithms

We consider minimizing the sum of three convex functions, where the first one, F, is smooth, the second one is nonsmooth and proximable, and the third one is the composition of a nonsmooth proximable function with a linear operator.
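As a sketch of how such three-term problems are typically handled by primal-dual splitting, the following loop implements a Condat–Vũ-type iteration, with the dual prox obtained from prox_H via the Moreau identity. The function signatures (grad_F, prox_R, prox_H) and the step-size condition are illustrative assumptions, not the specific algorithms introduced in that paper.

```python
import numpy as np

def condat_vu(grad_F, prox_R, prox_H, K, x0, tau, sigma, n_iter=500):
    # Primal-dual splitting for min_x F(x) + R(x) + H(K x), with F smooth and
    # R, H proximable; prox_R(v, t) and prox_H(v, t) compute the prox of t*R, t*H.
    # A standard sufficient step-size condition is 1/tau - sigma*||K||^2 >= L_F/2.
    x = x0.copy()
    u = np.zeros(K.shape[0])                  # dual variable
    for _ in range(n_iter):
        x_new = prox_R(x - tau * grad_F(x) - tau * (K.T @ u), tau)
        z = u + sigma * (K @ (2 * x_new - x))            # extrapolated dual step
        u = z - sigma * prox_H(z / sigma, 1.0 / sigma)   # Moreau identity: prox of H*
        x = x_new
    return x
```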

An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints

TLDR
This work considers the task of minimizing a smooth strongly convex function F(x) under the affine constraint Kx = b, with an oracle providing evaluations of the gradient of F and multiplications by K and its transpose, and proposes an accelerated primal–dual algorithm achieving the corresponding lower bounds.
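For contrast with such an optimal method, here is a minimal non-accelerated baseline for min F(x) subject to Kx = b: alternating a primal gradient step on the Lagrangian with a dual ascent step on the constraint. The step sizes tau and sigma are assumed suitably small; this is a reference sketch, not the paper's algorithm.

```python
import numpy as np

def primal_dual_affine(grad_F, K, b, x0, tau, sigma, n_iter=2000):
    # Baseline primal-dual gradient method for min F(x) s.t. K x = b:
    # descent on x in the Lagrangian F(x) + <u, K x - b>, ascent on the multiplier u.
    x = x0.copy()
    u = np.zeros(K.shape[0])
    for _ in range(n_iter):
        x = x - tau * (grad_F(x) + K.T @ u)   # primal descent step
        u = u + sigma * (K @ x - b)           # dual ascent step on the residual
    return x, u
```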

Faster First-Order Primal-Dual Methods for Linear Programming using Restarts and Sharpness

TLDR
An adaptive restart scheme is developed, and it is verified that restarts improve the ability of PDHG, EGM, and ADMM to reach high-accuracy solutions to LP problems; the analysis applies to the strictly more general class of sharp primal–dual problems.

DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization for Time-Varying Gossips

DADAO is a novel decentralized asynchronous stochastic algorithm to minimize a sum of L-smooth and µ-strongly convex functions distributed over a time-varying connectivity network of size n.

ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally!

TLDR
ProxSkip is a surprisingly simple and provably effective method for minimizing the sum of a smooth and an expensive nonsmooth proximable function, and offers an effective acceleration of communication complexity.
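The core recursion is simple enough to sketch. Below is a single-machine reading of ProxSkip as we understand it from the paper: the expensive prox is applied only with probability p per iteration, and a control variate h compensates between prox steps. Constants and the distributed (federated) variant are simplified here, so treat this as a sketch rather than a faithful reimplementation.

```python
import numpy as np

def proxskip(grad_f, prox_psi, x0, gamma, p, n_iter=1000, seed=0):
    # Minimize f(x) + psi(x), where the prox of psi is expensive (in the federated
    # setting psi is a consensus constraint and the prox is a communication round).
    # prox_psi(v, t) computes the prox of t*psi at v.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    h = np.zeros_like(x0)          # control variate (estimates a subgradient of psi)
    for _ in range(n_iter):
        x_hat = x - gamma * (grad_f(x) - h)
        if rng.random() < p:       # rare prox (communication) step
            x = prox_psi(x_hat - (gamma / p) * h, gamma / p)
            h = h + (p / gamma) * (x - x_hat)
        else:                      # skip the prox; h is unchanged
            x = x_hat
    return x
```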

Distributed Forward-Backward Methods without Central Coordination

In this work, we propose and analyse forward-backward-type algorithms for finding a zero of the sum of finitely many monotone operators which are not based on reduction to a two-operator inclusion in the product space.

EF-BV: A Unified Theory of Error Feedback and Variance Reduction Mechanisms for Biased and Unbiased Compression in Distributed Optimization

TLDR
The general approach works with a new, larger class of compressors, which has two parameters, the bias and the variance, and includes unbiased and biased compressors as particular cases; linear convergence is proved under certain conditions.
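To make the two compressor families concrete, here are two standard examples: an unbiased sparsifier (rand-k, rescaled so its expectation is the identity) and a biased, contractive one (top-k). These are textbook illustrations, not the new compressor class defined in that paper.

```python
import numpy as np

def rand_k(x, k, rng):
    # Unbiased compressor: keep k random coordinates, scaled by d/k so E[C(x)] = x.
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)
    return out

def top_k(x, k):
    # Biased, contractive compressor: keep the k largest-magnitude coordinates.
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out
```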

References

Showing 1-10 of 60 references

Proximal splitting algorithms: Relax them all!

TLDR
This work presents several existing proximal splitting algorithms and derives new ones within a unified framework, which consists in applying splitting methods for monotone inclusions, like the forward–backward algorithm, in primal–dual product spaces with well-chosen metrics, leading to new convergence theorems with larger parameter ranges.

Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists

TLDR
This overview of recent proximal splitting algorithms presents them within a unified framework, which consists in applying splitting methods for monotone inclusions in primal-dual product spaces, with well-chosen metrics, and emphasizes that when the smooth term in the objective function is quadratic, convergence is guaranteed with larger values of the relaxation parameter than previously known.

Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms

TLDR
New primal–dual algorithms to minimize the sum of three convex functions, each having its own oracle, are introduced and proved to converge to an exact solution with sublinear or linear rates, depending on strong convexity properties.

Decentralized Proximal Gradient Algorithms With Linear Convergence Rates

TLDR
A general primal–dual algorithmic framework that unifies many existing state-of-the-art algorithms is proposed, and linear convergence of the method to the exact minimizer is established in the presence of the nonsmooth term.

On the ergodic convergence rates of a first-order primal–dual algorithm

TLDR
The proofs of convergence for a first-order primal–dual algorithm for convex optimization are revisited, with simpler proofs and more complete results that can deal with explicit terms and nonlinear proximity operators in spaces with quite general norms.

A Primal–Dual Splitting Method for Convex Optimization Involving Lipschitzian, Proximable and Linear Composite Terms

TLDR
This work brings together and notably extends several classical splitting schemes, like the forward–backward and Douglas–Rachford methods, as well as the recent primal–dual method of Chambolle and Pock designed for problems with linear composite terms.

A Generic Proximal Algorithm for Convex Optimization—Application to Total Variation Minimization

  • Laurent Condat · IEEE Signal Processing Letters, 2014
TLDR
New optimization algorithms are proposed to minimize a sum of convex functions, which may be smooth or not and composed or not with linear operators, with applications including various forms of regularized inverse problems in imaging.

Primal-Dual Splitting Algorithm for Solving Inclusions with Mixtures of Composite, Lipschitzian, and Parallel-Sum Type Monotone Operators

TLDR
This work brings together and notably extends various types of structured monotone inclusion problems and their solution methods; the application to convex minimization problems is given special attention.

Proximal Algorithms

TLDR
The many different interpretations of proximal operators and algorithms are discussed, their connections to many other topics in optimization and applied mathematics are described, some popular algorithms are surveyed, and a large number of examples of proximal operators that commonly arise in practice are provided.
...
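As a closing reminder of the object underlying all of the above, the proximity operator prox_{t g}(v) = argmin_x g(x) + (1/(2t)) ||x - v||^2, here are two elementary instances; the helper names are illustrative.

```python
import numpy as np

# prox_{t g}(v) = argmin_x  g(x) + (1/(2t)) * ||x - v||^2

def prox_box(v, lo, hi):
    # g = indicator of the box [lo, hi]: the prox is the Euclidean projection (clip).
    return np.clip(v, lo, hi)

def prox_squared_norm(v, t):
    # g(x) = 0.5 * ||x||^2: the prox is a simple shrinkage toward the origin.
    return v / (1.0 + t)
```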