# A Generalized Forward-Backward Splitting

@article{Raguet2013AGF, title={A Generalized Forward-Backward Splitting}, author={Hugo Raguet and Mohamed-Jalal Fadili and Gabriel Peyr{\'e}}, journal={SIAM J. Imaging Sci.}, year={2013}, volume={6}, pages={1199-1226} }

This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm can handle at most $n = 1$ non-smooth function, our method generalizes it to arbitrary $n$. Our method makes explicit use of the regularity of $F$ in…
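The iteration described in the abstract can be sketched as follows: each $G_i$ gets an auxiliary variable updated through its proximity operator after an explicit gradient step on $F$, and the primal iterate is a weighted average of the auxiliary variables. This is a minimal illustrative sketch, not the paper's reference implementation; the function and variable names, the equal weights, and the test problem (least squares plus an $\ell_1$ penalty plus a nonnegativity constraint) are all choices made here for the example.

```python
import numpy as np

def gfb(grad_f, proxes, x0, gamma, lam=1.0, n_iter=500):
    """Sketch of generalized forward-backward splitting for F + sum_i G_i.

    grad_f(x) returns the gradient of the smooth term F;
    proxes[i](v, t) returns prox_{t * G_i}(v). Names are illustrative.
    """
    n = len(proxes)
    w = np.full(n, 1.0 / n)            # weights w_i > 0 summing to 1
    z = [x0.copy() for _ in range(n)]  # one auxiliary variable per G_i
    x = x0.copy()
    for _ in range(n_iter):
        g = grad_f(x)                  # explicit (forward) gradient step on F
        for i in range(n):
            # implicit (backward) step through the prox of G_i, relaxed by lam
            z[i] = z[i] + lam * (proxes[i](2 * x - z[i] - gamma * g, gamma / w[i]) - x)
        x = sum(wi * zi for wi, zi in zip(w, z))  # weighted average of auxiliaries
    return x

# Example problem: min 0.5*||Ax - b||^2 + mu*||x||_1 subject to x >= 0,
# i.e. two simple non-smooth terms G_1, G_2 on top of a smooth F.
rng = np.random.default_rng(0)
A, b, mu = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - mu * t, 0.0)  # prox of mu*||.||_1
nneg = lambda v, t: np.maximum(v, 0.0)                                # prox of the indicator of x >= 0
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad F
x_hat = gfb(grad_f, [soft, nneg], np.zeros(10), gamma=1.0 / L)
```

With a step size $\gamma$ below $2/L$ (here $1/L$) the iterates converge to a minimizer; the returned `x_hat` is nonnegative and sparse.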


## 300 Citations

Iteration-Complexity of a Generalized Forward Backward Splitting Algorithm

- Mathematics, ICASSP 2014
- 2013

This paper derives pointwise and ergodic iteration-complexity bounds for an inexact version of GFB, yielding an approximate solution under an easily verifiable termination criterion, and proves complexity bounds for relaxed and inexact fixed-point iterations built from compositions of nonexpansive averaged operators.

Preconditioning of a Generalized Forward-Backward Splitting and Application to Optimization on Graphs

- Computer Science, SIAM J. Imaging Sci.
- 2015

The preconditioning of a generalized forward-backward splitting algorithm for finding a zero of a sum of maximally monotone operators with a cocoercive term can handle large-scale, nonsmooth, convex optimization problems structured on graphs.

On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence

- Computer Science, Mathematics, SIAM J. Optim.
- 2019

A framework for quasi-Newton forward-backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal $\pm$ rank-$r$ symmetric positive definite matrices is introduced, which allows for a highly efficient evaluation of the proximal mapping.

A note on the forward-Douglas–Rachford splitting for monotone inclusion and convex optimization

- Mathematics, Optim. Lett.
- 2019

It is shown that extending the splitting to an arbitrary number of maximally monotone operators follows directly from a fixed-point algorithm previously proposed as a generalization of the forward-backward splitting algorithm.

Local and Global Convergence of an Inertial Version of Forward-Backward Splitting

- Computer Science, Mathematics
- 2015

This paper first applies a global Lyapunov analysis to I-FBS and proves weak convergence of the iterates to a minimizer in a real Hilbert space, and shows that the algorithms achieve local linear convergence for "sparse optimization", which is the important special case where the nonsmooth term is the $\ell_1$-norm.

A First-Order Splitting Method for Solving a Large-Scale Composite Convex Optimization Problem

- Computer Science, Mathematics, Journal of Computational Mathematics
- 2019

Several iterative algorithms are developed that involve only computing the gradient of the differentiable function and the proximity operators of related convex functions, and that outperform other algorithms for solving a multi-block composite convex optimization problem.

Fast convergence of generalized forward-backward algorithms for structured monotone inclusions

- Mathematics
- 2021

In this paper, we develop rapidly convergent forward-backward algorithms for computing zeroes of the sum of finitely many maximally monotone operators. A modification of the classical…

An Inertial Semi-forward-reflected-backward Splitting and Its Application

- Mathematics, Acta Mathematica Sinica, English Series
- 2022

Inertial methods play a vital role in accelerating the convergence speed of optimization algorithms. This work is concerned with an inertial semi-forward-reflected-backward splitting algorithm of…

Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions

- Mathematics, Computer Science, Comput. Optim. Appl.
- 2017

This paper analyzes a family of generalized inertial proximal splitting algorithms (GIPSA) for solving convex composite minimization problems in a Hilbert space and proves local linear convergence under either restricted strong convexity or a strict complementarity condition.

Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization

- Mathematics, SIAM J. Optim.
- 2020

A splitting scheme that hybridizes the generalized conditional gradient with a proximal step is analyzed for a wide choice of algorithm parameters satisfying so-called "open loop" rules; asymptotic feasibility with respect to the affine constraint, boundedness of the dual multipliers, and convergence of the Lagrangian values to the saddle-point optimal value are established.

## References

Showing 1-10 of 111 references

Gradient methods for minimizing composite functions

- Mathematics, Computer Science, Math. Program.
- 2013

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…

Nested Iterative Algorithms for Convex Constrained Image Recovery Problems

- Mathematics, Computer Science, SIAM J. Imaging Sci.
- 2009

The weak convergence of the proposed algorithms is proved and it is shown that, under some assumptions, it remains possible to apply these methods to the considered optimization problem by making use of a quadratic extension technique.

Convergence Rates in Forward-Backward Splitting

- Mathematics, SIAM J. Optim.
- 1997

For the first time in a general setting, global and local contraction rates are derived, in a form that makes it possible to determine the optimal step size relative to certain constants associated with the given problem.

Iteration-complexity of block-decomposition algorithms and the alternating minimization augmented Lagrangian method

- Mathematics, Computer Science
- 2010

A framework of block-decomposition prox-type algorithms for solving the monotone inclusion problem is presented, and it is shown that any method in this framework is also a special instance of the hybrid proximal extragradient (HPE) method introduced by Solodov and Svaiter.

Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers

- Mathematics, Computer Science, SIAM J. Optim.
- 2013

A framework of block-decomposition prox-type algorithms for solving the monotone inclusion problem is presented, and it is shown that any method in this framework is a special instance of the hybrid proximal extragradient (HPE) method introduced by Solodov and Svaiter.

General Projective Splitting Methods for Sums of Maximal Monotone Operators

- Mathematics, SIAM J. Control. Optim.
- 2009

A general projective framework for finding a zero of the sum of $n$ maximal monotone operators over a real Hilbert space is described, which gives rise to a family of splitting methods of unprecedented flexibility.

A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings

- Mathematics, SIAM J. Control. Optim.
- 2000

A modification to the forward-backward splitting method for finding a zero of the sum of two maximal monotone mappings is proposed, under which the method converges assuming only the forward mapping is (Lipschitz) continuous on some closed convex subset of its domain.

Signal Recovery by Proximal Forward-Backward Splitting

- Mathematics, Multiscale Model. Simul.
- 2005

We show that various inverse problems in signal recovery can be formulated as the generic problem of minimizing the sum of two convex functions with certain regularity properties. This formulation…
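The $n = 1$ case this reference covers is the classical forward-backward iteration (ISTA when the non-smooth term is the $\ell_1$-norm): a gradient step on the smooth term followed by a proximal step on the non-smooth one. A minimal sketch, with illustrative names and a small lasso instance chosen here as the test problem:

```python
import numpy as np

def fbs(grad_f, prox_g, x0, gamma, n_iter=500):
    # forward (gradient) step on the smooth F, then backward (proximal) step on G
    x = x0.copy()
    for _ in range(n_iter):
        x = prox_g(x - gamma * grad_f(x), gamma)
    return x

# Example: the lasso, F = 0.5*||Ax - b||^2 and G = mu*||x||_1 (this is ISTA)
rng = np.random.default_rng(1)
A, b, mu = rng.standard_normal((15, 8)), rng.standard_normal(15), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - mu * t, 0.0)  # prox of mu*||.||_1
x_hat = fbs(grad_f, soft, np.zeros(8), gamma=1.0 / np.linalg.norm(A, 2) ** 2)
```

With step size below $2/L$, where $L$ is the Lipschitz constant of $\nabla F$, the iterates converge to a minimizer of the sum.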

Gradient methods for minimizing composite objective function

- Computer Science, Mathematics
- 2007

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and…

A proximal-based decomposition method for convex minimization problems

- Mathematics, Computer Science, Math. Program.
- 1994

This paper presents a decomposition method for solving convex minimization problems that preserves the good features of the proximal method of multipliers, with the additional advantage that it leads to a decoupling of the constraints, and is thus suitable for parallel implementation.