Corpus ID: 119176914

Linear and sublinear convergence rates for a subdifferentiable distributed deterministic asynchronous Dykstra's algorithm

  • C. Pang
  • Published 30 June 2018
  • Mathematics
  • arXiv: Optimization and Control
In two earlier papers, we designed a distributed deterministic asynchronous algorithm, based on Dykstra's algorithm (equivalently, block coordinate dual ascent), for minimizing the sum of subdifferentiable and proximable functions and a regularizing quadratic on time-varying graphs. The function at each node of the distributed optimization problem is the sum of a known regularizing quadratic and a function to be minimized. In this paper, we prove sublinear convergence rates for the general algorithm, and a linear rate of…
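Dykstra's algorithm, on which the paper's method builds, is easiest to see in its classical set-projection form. The sketch below is illustrative only (the sets, the point to project, and the cycle count are arbitrary choices, not the paper's setting): it computes the projection of a point onto the intersection of a ball and a halfspace, and the per-set correction vectors `p[i]` are what distinguish Dykstra's method from plain cyclic projections.

```python
import numpy as np

def proj_ball(x, radius=1.0):
    # Euclidean projection onto the ball {x : ||x|| <= radius}
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def proj_halfspace(x, a, b):
    # Euclidean projection onto the halfspace {x : a.x <= b}
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def dykstra(z, projections, n_cycles=200):
    """Dykstra's algorithm: computes the projection of z onto the
    intersection of the sets, given their individual projection maps."""
    x = z.astype(float)
    # one correction (dual increment) per set
    p = [np.zeros_like(x) for _ in projections]
    for _ in range(n_cycles):
        for i, proj in enumerate(projections):
            y = proj(x + p[i])
            p[i] = x + p[i] - y   # update the correction for set i
            x = y
    return x

a = np.array([1.0, 1.0])
projs = [lambda v: proj_ball(v, 1.0),
         lambda v: proj_halfspace(v, a, 1.0)]
x = dykstra(np.array([3.0, 3.0]), projs)
print(np.round(x, 4))   # close to [0.5, 0.5], the true projection
```

Here the halfspace constraint is the binding one at the solution, and the corrections `p[i]` converge to (scaled) dual multipliers, which is the sense in which Dykstra's method is a dual ascent.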

On the Convergence Rate of Incremental Aggregated Gradient Algorithms
It is shown that this deterministic incremental aggregated gradient method has global linear convergence, its convergence rate is characterized, and an aggregated method with momentum is also shown to converge linearly.
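The incremental aggregated gradient idea summarized above can be sketched in a few lines: keep a table of the most recently evaluated gradient of each component and step along the (possibly stale) aggregate. The component functions, step size, and cyclic selection below are illustrative choices, not those of the paper.

```python
import numpy as np

# Minimize f(x) = sum_i 0.5 * (x - b_i)^2; the minimizer is mean(b).
b = np.array([1.0, 2.0, 6.0])
grad = lambda i, x: x - b[i]          # gradient of component i

x = 0.0
table = np.array([grad(i, x) for i in range(len(b))])  # stored gradients
step = 0.1
for k in range(500):
    i = k % len(b)                    # cyclic component selection
    table[i] = grad(i, x)             # refresh only component i's gradient
    x -= step * table.mean()          # step with the stale aggregate
print(round(x, 4))                    # approaches mean(b) = 3.0
```

Only one component gradient is evaluated per iteration, yet the stored aggregate keeps a bounded staleness, which is what the linear-rate analyses of such methods exploit.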
Two generalizations of Dykstra’s cyclic projections algorithm
Two generalizations of Dykstra's cyclic projections algorithm are presented: firstly, they allow the number of sets Ci to be infinite rather than finite; secondly, they allow a random, rather than cyclic, ordering of the sets Ci.
Achieving Geometric Convergence for Distributed Optimization Over Time-Varying Graphs
This paper introduces a distributed algorithm, referred to as DIGing, based on a combination of a distributed inexact gradient method and a gradient tracking technique that converges to a global and consensual minimizer over time-varying graphs.
On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
  • A. Beck
  • Mathematics, Computer Science
    SIAM J. Optim.
  • 2015
This paper analyzes the convergence rate of the alternating minimization method and establishes a nonasymptotic sublinear rate of convergence where the multiplicative constant depends on the minimal block Lipschitz constant, and studies the convergence properties of a decomposition-based approach designed to solve convex problems involving sums of norms.
A cyclic projection algorithm via duality
We consider the problem of finding the projection of a given point in a Hilbert space onto the intersection of finitely many closed convex sets. A very simple iterative procedure was established by…
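To see why the correction terms in Dykstra-type procedures matter, compare plain cyclic (alternating) projections: they reach a point of the intersection, but in general not the nearest one. A small sketch with two illustrative halfspaces (chosen here so the gap is visible):

```python
import numpy as np

def proj_halfspace(x, a, b):
    # Euclidean projection onto {x : a.x <= b}
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

# C1 = {x1 + x2 <= 0}, C2 = {x1 <= 0}
P1 = lambda v: proj_halfspace(v, np.array([1.0, 1.0]), 0.0)
P2 = lambda v: proj_halfspace(v, np.array([1.0, 0.0]), 0.0)

z = np.array([2.0, 1.0])
x = z.copy()
for _ in range(100):
    x = P2(P1(x))        # plain cyclic projections, no corrections
print(x)                 # lands at (0, -0.5), a point of C1 ∩ C2
```

The true projection of z onto C1 ∩ C2 is (0, 0), so cyclic projections settle on a feasible but non-nearest point; Dykstra's corrections are precisely what repair this.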
Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server
This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. It establishes a linear convergence rate, gives explicit expressions for step-size choices that guarantee convergence to the optimum, and bounds the associated convergence factors.
On the Convergence of Block Coordinate Descent Type Methods
This paper analyzes the block coordinate gradient projection method in which each iteration consists of performing a gradient projection step with respect to a certain block taken in a cyclic order and establishes global sublinear rate of convergence.
Asynchronous distributed optimization using a randomized alternating direction method of multipliers
A new class of random asynchronous distributed optimization methods that generalize the standard Alternating Direction Method of Multipliers to an asynchronous setting where isolated components of the network are activated in an uncoordinated fashion are introduced.
On the O(1/k) Convergence of Asynchronous Distributed Alternating Direction Method of Multipliers
  • Ermin Wei, A. Ozdaglar
  • Computer Science, Mathematics
    2013 IEEE Global Conference on Signal and Information Processing
  • 2013
A novel asynchronous ADMM-based distributed method is presented for the general formulation of a network of agents cooperatively solving a global optimization problem, and it is shown to converge at the rate O(1/k).
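For contrast with the asynchronous setting analyzed above, here is a minimal synchronous global-consensus ADMM sketch (illustrative quadratic agent costs and penalty parameter; the paper's method instead activates agents asynchronously):

```python
import numpy as np

# Agents solve: min sum_i 0.5*(x_i - b_i)^2  s.t.  x_i = z for all i.
b = np.array([1.0, 2.0, 6.0])
rho = 1.0                                  # ADMM penalty parameter
x = np.zeros(3); u = np.zeros(3); z = 0.0  # u: scaled dual variables
for _ in range(100):
    x = (b + rho * (z - u)) / (1 + rho)    # local proximal steps (closed form)
    z = np.mean(x + u)                     # consensus (averaging) update
    u += x - z                             # scaled dual ascent
print(round(z, 4))                         # converges to mean(b) = 3.0
```

Each x-update uses only agent i's data, and the z-update is a network average, which is why ADMM decomposes naturally over a network of agents; the asynchronous variants replace the synchronized averaging with uncoordinated activations.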
Asynchronous block-iterative primal-dual decomposition methods for monotone inclusions
This work proposes new primal-dual decomposition algorithms for solving systems of inclusions involving sums of linearly composed maximally monotone operators, and presents two related methods: the first method provides weakly convergent primal and dual sequences under general conditions, while the second is a variant in which strong convergence is guaranteed without additional assumptions.