# Further properties of the forward–backward envelope with applications to difference-of-convex programming

@article{Liu2017FurtherPO, title={Further properties of the forward–backward envelope with applications to difference-of-convex programming}, author={Tianxiang Liu and Ting Kei Pong}, journal={Computational Optimization and Applications}, year={2017}, volume={67}, pages={489-520} }

In this paper, we further study the forward–backward envelope first introduced in Patrinos and Bemporad (Proceedings of the IEEE Conference on Decision and Control, pp 2358–2363, 2013) and Stella et al. (Comput Optim Appl, doi:10.1007/s10589-017-9912-y, 2017) for problems whose objective is the sum of a proper closed convex function and a twice continuously differentiable, possibly nonconvex, function with Lipschitz continuous gradient. We derive sufficient conditions on the original problem for…
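For context, the forward–backward envelope the abstract refers to can be written out explicitly. A sketch of the standard definition, in our own notation (objective $F = f + g$ with $f$ smooth and $g$ proper closed convex, stepsize $\gamma > 0$):

```latex
% Forward-backward envelope of F = f + g (f smooth, g proper closed convex)
F_\gamma(x) \;=\; \min_{z}\Big\{ f(x) + \langle \nabla f(x),\, z - x\rangle
      + g(z) + \tfrac{1}{2\gamma}\,\|z - x\|^2 \Big\}.
% The minimizer is the forward-backward (proximal gradient) step
%   T_\gamma(x) = \mathrm{prox}_{\gamma g}\big(x - \gamma \nabla f(x)\big),
% so F_\gamma is a real-valued surrogate whose minimizers relate to those of F
% for suitably small \gamma.
```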

## 33 Citations

Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms

- Mathematics, Computer Science
- SIAM J. Optim.
- 2018

It is shown that the forward-backward envelope (FBE), an exact and strictly continuous penalty function for the original cost, still enjoys favorable first- and second-order properties, which are key to the convergence results for ZeroFPR.

Forward–backward quasi-Newton methods for nonsmooth optimization problems

- Mathematics, Computer Science
- Comput. Optim. Appl.
- 2017

This work proposes an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka–Łojasiewicz property at its critical points; the analysis of superlinear convergence is based on an extension of the Dennis–Moré theorem.

A proximal difference-of-convex algorithm with extrapolation

- Mathematics, Computer Science
- Comput. Optim. Appl.
- 2018

A proximal difference-of-convex algorithm with extrapolation is proposed to possibly accelerate the proximal DCA, and it is shown that any cluster point of the sequence generated by the algorithm is a stationary point of the DC optimization problem for a fairly general choice of extrapolation parameters.
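As a rough illustration of the extrapolation idea (not the authors' implementation; the test problem, the ℓ1−ℓ2 regularizer, and the FISTA-style extrapolation schedule below are our own choices), a proximal DC iteration with extrapolation for minimizing ½‖Ax − b‖² + μ(‖x‖₁ − ‖x‖₂) might look like:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdca_e(A, b, mu=0.1, iters=500):
    """Sketch of a proximal DC algorithm with extrapolation for
    min_x 0.5*||A x - b||^2 + mu*(||x||_1 - ||x||_2)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of x -> A^T(Ax - b)
    x = x_prev = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(iters):
        # FISTA-style extrapolation weights (one valid choice of parameters)
        t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)
        # DC step: linearize the concave part -mu*||x||_2 via a subgradient
        nx = np.linalg.norm(x)
        xi = mu * x / nx if nx > 0 else np.zeros_like(x)
        grad = A.T @ (A @ y - b) - xi        # gradient of the linearized smooth part
        x_prev, x = x, soft_threshold(y - grad / L, mu / L)
    return x
```

On a trivial instance (A = I), the iteration quickly zeroes out small components of b while keeping large ones, as expected of the ℓ1−ℓ2 penalty.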

Proximal envelopes: Smooth optimization algorithms for nonsmooth problems

- Computer Science
- 2017

An interpretation of proximal algorithms as unconstrained gradient methods over an associated function is provided; proximal envelopes thus give a link between nonsmooth and smooth optimization, and allow more efficient and robust smooth optimization algorithms to be applied to nonsmooth, possibly constrained, problems.

Retraction-based first-order feasible sequential quadratic programming methods for difference-of-convex programs with smooth inequality and simple geometric constraints

- 2021

In this paper, we propose first-order feasible sequential quadratic programming (SQP) methods for difference-of-convex (DC) programs with smooth inequality and simple geometric constraints. Different…

Bregman forward-backward splitting for nonconvex composite optimization: superlinear convergence to nonisolated critical points

- Mathematics
- 2019

We introduce Bella, a locally superlinearly convergent Bregman forward-backward splitting method for minimizing the sum of two nonconvex functions, one of which satisfies a relative smoothness…

The modified second APG method for DC optimization problems

- Computer Science, Mathematics
- Optim. Lett.
- 2019

A variant of the second accelerated proximal gradient method, introduced by Nesterov and by Auslender and Teboulle, is constructed for minimizing DC functions (differences of two convex functions).

Proximal Gradient Algorithms under Local Lipschitz Gradient Continuity: A Convergence and Robustness Analysis of PANOC

- Mathematics
- 2021

Composite optimization offers a powerful modeling tool for a variety of applications and is often numerically solved by means of proximal gradient methods. In this paper, we consider fully nonconvex…

An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems

- 2021

In this paper, we propose a new method for a class of difference-of-convex (DC) optimization problems, whose objective is the sum of a smooth function and a possibly nonprox-friendly DC function. The…

Newton-Type Alternating Minimization Algorithm for Convex Optimization

- Mathematics, Computer Science
- IEEE Transactions on Automatic Control
- 2019

Experiments show that using limited-memory directions in NAMA greatly improves the convergence speed over AMA and its accelerated variant, and the proposed method is well suited for embedded applications and large-scale problems.

## References

Showing 1–10 of 40 references

Forward–backward quasi-Newton methods for nonsmooth optimization problems

- Mathematics, Computer Science
- Comput. Optim. Appl.
- 2017

This work proposes an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka–Łojasiewicz property at its critical points; the analysis of superlinear convergence is based on an extension of the Dennis–Moré theorem.

Calculus of the Exponent of Kurdyka–Łojasiewicz Inequality and Its Applications to Linear Convergence of First-Order Methods

- Mathematics, Computer Science
- Found. Comput. Math.
- 2018

The Kurdyka–Łojasiewicz exponent is studied, an important quantity for analyzing the convergence rate of first-order methods, and various calculus rules are developed to deduce the KL exponent of new (possibly nonconvex and nonsmooth) functions formed from functions with known KL exponents.

Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward–backward splitting, and regularized Gauss–Seidel methods

- Mathematics, Computer Science
- Math. Program.
- 2013

This work proves an abstract convergence result for descent methods that satisfy a sufficient-decrease condition and allow a relative error tolerance; the result guarantees convergence of bounded sequences under the assumption that the function f satisfies the Kurdyka–Łojasiewicz inequality.

Semi-Smooth Second-order Type Methods for Composite Convex Programs

- Mathematics
- 2016

The goal of this paper is to study approaches to bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: i) Many well-known operator…

A coordinate gradient descent method for nonsmooth separable minimization

- Mathematics, Computer Science
- Math. Program.
- 2009

A (block) coordinate gradient descent method is proposed for solving this class of nonsmooth separable problems; global convergence is established and, under a local Lipschitzian error bound assumption, linear convergence.

Minimization of ℓ1-2 for Compressed Sensing

- Mathematics, Computer Science
- SIAM J. Sci. Comput.
- 2015

A sparsity-oriented simulated annealing procedure with non-Gaussian random perturbation is proposed, and the almost sure convergence of the combined algorithm (DCASA) to a global minimum is proved.

Penalty Methods for a Class of Non-Lipschitz Optimization Problems

- Mathematics, Computer Science
- SIAM J. Optim.
- 2016

A penalty method whose subproblems are solved via a nonmonotone proximal gradient method with a suitable update scheme for the penalty parameters is discussed, and the convergence of the algorithm to a KKT point of the constrained problem is proved.

Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Lojasiewicz Inequality

- Mathematics, Computer Science
- Math. Oper. Res.
- 2010

A convergent proximal reweighted l1 algorithm for compressive sensing and an application to rank reduction problems are provided; the convergence analysis depends on the geometrical properties of the function L around its critical points.

A unified approach to error bounds for structured convex optimization problems

- Computer Science, Mathematics
- Math. Program.
- 2017

A new framework for establishing error bounds for a class of structured convex optimization problems, in which the objective function is the sum of a smooth convex function and a general closed proper convex function, is presented.

The Moreau envelope function and proximal mapping in the sense of the Bregman distance

- Mathematics
- 2012

In this paper, we explore some properties of the Moreau envelope function $e_\lambda f(x)$ of $f$ and the associated proximal mapping $P_\lambda f(x)$ in the sense of the Bregman distance induced by a…
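For reference, the classical (Euclidean) versions of the two objects this entry generalizes are, in our notation:

```latex
% Moreau envelope and proximal mapping of f with parameter \lambda > 0
e_\lambda f(x) = \min_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
P_\lambda f(x) = \operatorname*{arg\,min}_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\}.
% The Bregman variants studied in this reference replace the quadratic term
% \tfrac{1}{2}\|y - x\|^2 by a Bregman distance D_\phi(y, x).
```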