Approximate Douglas–Rachford algorithm for two-sets convex feasibility problems

@article{DazMilln2021ApproximateDA,
  title={Approximate Douglas–Rachford algorithm for two-sets convex feasibility problems},
  author={R. D{\'i}az Mill{\'a}n and Orizon Pereira Ferreira and Julien Ugon},
  journal={Journal of Global Optimization},
  year={2021},
  pages={1-16}
}
In this paper, we propose a new algorithm combining the Douglas–Rachford (DR) algorithm and the Frank–Wolfe algorithm, also known as the conditional gradient (CondG) method, for solving the classic convex feasibility problem. Within the algorithm, named the Approximate Douglas–Rachford (ApDR) algorithm, the CondG method is used as a subroutine to compute feasible inexact projections onto the sets under consideration, and the ApDR iteration is defined based on the DR iteration. The ApDR…
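A hedged sketch of how the two ingredients described above can fit together, assuming each set is accessible only through a linear minimization oracle; the names `condg_approx_projection`, `lmo_A`, and `lmo_B` are illustrative, and the update is a generic DR-style iteration with CondG-computed inexact projections, not necessarily the authors' exact ApDR scheme.

```python
import numpy as np

def condg_approx_projection(z, lmo, x_start, tol=1e-6, max_iter=200):
    """Conditional gradient (Frank-Wolfe) applied to min_{x in C} 0.5*||x - z||^2.

    `lmo(g)` must return a minimizer of <g, v> over C; every iterate stays in C,
    so the result is a feasible inexact projection of z onto C."""
    x = x_start.copy()
    for _ in range(max_iter):
        grad = x - z                        # gradient of 0.5*||x - z||^2
        v = lmo(grad)                       # Frank-Wolfe vertex
        gap = grad @ (x - v)                # duality gap controls projection accuracy
        if gap <= tol:
            break
        d = v - x
        step = min(1.0, gap / (d @ d))      # exact line search for the quadratic objective
        x = x + step * d
    return x

def dr_with_condg_projections(lmo_A, lmo_B, x0, iters=100, tol=1e-6):
    """DR-style iteration for a point in A ∩ B, with the exact projections
    replaced by CondG-computed feasible inexact projections."""
    x = x0.copy()
    pA, pB = lmo_A(x), lmo_B(x)             # feasible warm starts for CondG
    for _ in range(iters):
        pA = condg_approx_projection(x, lmo_A, pA, tol=tol)
        rA = 2.0 * pA - x                   # reflection built from the inexact projection onto A
        pB = condg_approx_projection(rA, lmo_B, pB, tol=tol)
        rB = 2.0 * pB - rA                  # reflection built from the inexact projection onto B
        x = 0.5 * (x + rB)                  # averaged (DR) update of the governing sequence
    return pA                               # a point of A that is also nearly in B
```

Because CondG only calls the linear minimization oracles, this sketch needs no projection routine for either set; for example, an oracle over a box or an ℓ1 ball has a simple closed form.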

Alternating Linear Minimization: Revisiting von Neumann's alternating projections

In 1933 von Neumann proved a beautiful result that one can approximate a point in the intersection of two convex sets by alternating projections, i.e., successively projecting onto one set and then the other.
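A minimal numerical illustration of the alternating projection scheme, with two overlapping Euclidean balls chosen purely for the example:

```python
import numpy as np

def project_ball(x, center, radius):
    """Euclidean projection onto the ball {y : ||y - center|| <= radius}."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def alternating_projections(x0, proj_A, proj_B, iters=500):
    """von Neumann's scheme: project onto A, then onto B, and repeat."""
    x = x0.copy()
    for _ in range(iters):
        x = proj_B(proj_A(x))
    return x

# two overlapping balls in the plane; the limit lies in their intersection
pA = lambda x: project_ball(x, np.array([0.0, 0.0]), 1.0)
pB = lambda x: project_ball(x, np.array([1.5, 0.0]), 1.0)
print(alternating_projections(np.array([3.0, 2.0]), pA, pB))
```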

The Douglas–Rachford algorithm for convex and nonconvex feasibility problems

This self-contained tutorial develops the convergence theory of projection algorithms within the framework of fixed point iterations, explains how to devise useful feasibility problem formulations, and demonstrates the application of the Douglas–Rachford method to said formulations.

Iteration complexity of an inexact Douglas-Rachford method and of a Douglas-Rachford-Tseng’s F-B four-operator splitting method for solving monotone inclusions

This paper proposes and studies the iteration complexity of an inexact DRS method and a Douglas-Rachford-Tseng’s forward-backward (F-B) splitting method for solving two-operator and four-operator monotone inclusions, respectively.

Alternating Projections and Douglas-Rachford for Sparse Affine Feasibility

This work considers elementary methods based on projections for solving a sparse feasibility problem without employing convex heuristics, and applies different analytical tools to show global linear convergence of alternating projections under familiar constraint qualifications.
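A hedged sketch of the projection-based approach for the sparse affine feasibility problem, assuming the constraint matrix has full row rank and using hard thresholding as the (nonconvex) sparse projection; the helper names and the planted example are illustrative only.

```python
import numpy as np

def project_affine(x, A, b):
    """Euclidean projection onto {x : A x = b} via a least-squares correction."""
    correction, *_ = np.linalg.lstsq(A, A @ x - b, rcond=None)
    return x - correction

def project_sparse(x, s):
    """Projection onto the nonconvex set {x : ||x||_0 <= s}: keep the s largest entries."""
    y = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    y[idx] = x[idx]
    return y

def alternating_projections_sparse(A, b, s, iters=200):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = project_sparse(project_affine(x, A, b), s)
    return x

# plant a 3-sparse solution and check the affine residual of the iterate
rng = np.random.default_rng(1)
A = rng.standard_normal((10, 30))
b = A @ project_sparse(rng.standard_normal(30), 3)
print(np.linalg.norm(A @ alternating_projections_sparse(A, b, 3) - b))
```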

Alternating conditional gradient method for convex feasibility problems

The classical convex feasibility problem in a finite-dimensional Euclidean space is studied, and the alternating projection method is combined with the CondG method to design a new method, which can be seen as an inexact feasible version of the alternating projection method.

Relative-error approximate versions of Douglas–Rachford splitting and special cases of the ADMM

A new approximate version of the alternating direction method of multipliers (ADMM) which uses a relative error criterion is proposed; it is computationally evaluated on several classes of test problems and found to significantly outperform several alternatives on one problem class.

Faster Rates for the Frank-Wolfe Method over Strongly-Convex Sets

This paper proves that the vanilla FW method converges at a rate of 1/t^2 over strongly convex sets, and shows that various balls induced by ℓp norms, Schatten norms, and group norms are strongly convex, while linear optimization over these sets is straightforward and admits a closed-form solution.
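A sketch of the vanilla FW method with the standard 2/(t+2) step size, using the closed-form linear minimization oracle over an ℓ2 ball (one of the strongly convex sets discussed); the data and radius are illustrative assumptions, and the faster rate from the paper relies on conditions not encoded here.

```python
import numpy as np

def lmo_l2_ball(grad, radius):
    """Closed-form linear minimization over the l2 ball: argmin_{||v|| <= radius} <grad, v>."""
    n = np.linalg.norm(grad)
    return np.zeros_like(grad) if n == 0 else -radius * grad / n

def frank_wolfe(grad_f, lmo, x0, iters=200):
    """Vanilla Frank-Wolfe with the standard 2/(t+2) step size."""
    x = x0.copy()
    for t in range(iters):
        v = lmo(grad_f(x))
        x = x + 2.0 / (t + 2.0) * (v - x)
    return x

# example: least squares over an l2 ball of radius 1 (illustrative data)
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
x_star = frank_wolfe(grad_f, lambda g: lmo_l2_ball(g, 1.0), np.zeros(5))
print(x_star)
```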

Projection-free accelerated method for convex optimization

Under reasonable assumptions, it is shown that the proposed method obtains an ϵ-approximate solution within a bounded number of gradient evaluations and linear oracle calls.

Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization

A new general framework for convex optimization over matrix factorizations, where every Frank-Wolfe iteration will consist of a low-rank update, is presented, and the broad application areas of this approach are discussed.
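To illustrate the low-rank update structure mentioned above, a hedged sketch of FW over a nuclear-norm ball: the linear minimization oracle returns a rank-one matrix built from the top singular pair of the gradient, so each iteration increases the rank of the iterate by at most one; a dense SVD is used here only for simplicity (a power or Lanczos method would be used at scale).

```python
import numpy as np

def lmo_nuclear_ball(G, tau):
    """LMO over {V : ||V||_* <= tau}: a rank-one matrix from the top singular pair of G."""
    U, _, Vt = np.linalg.svd(G, full_matrices=False)
    return -tau * np.outer(U[:, 0], Vt[0, :])

def frank_wolfe_matrix(grad_f, tau, shape, iters=100):
    """Each iteration mixes in a rank-one vertex, so the iterate has rank at most t + 1."""
    X = np.zeros(shape)
    for t in range(iters):
        V = lmo_nuclear_ball(grad_f(X), tau)
        X = X + 2.0 / (t + 2.0) * (V - X)
    return X
```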

Newton’s method with feasible inexact projections for solving constrained generalized equations

A new version of Newton’s method for solving constrained generalized equations, equipped with a procedure to obtain a feasible inexact projection, is proposed; the contraction mapping principle is used to establish a local analysis of the method under appropriate assumptions.

Second-order Conditional Gradient Sliding.

The SOCGS algorithm is presented; it uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly, and is useful when the feasible region can only be accessed efficiently through a linear optimization oracle and computing first-order information of the function, although possible, is costly.