We consider the class of iterative shrinkage-thresholding algorithms (ISTA) for solving linear inverse problems arising in signal/image processing. This class of methods, which can be viewed as an extension of the classical gradient algorithm, is attractive due to its simplicity and thus is adequate for solving large-scale problems even with dense matrix …
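The ISTA scheme referred to above alternates a gradient step on the smooth part of the objective with a shrinkage (soft-thresholding) step. A minimal sketch for the l1-regularized least-squares instance follows; the problem data, the 1/L step-size choice, and all names (`ista`, `soft_threshold`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    """ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1:
    a gradient step on the smooth part, then soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

# Illustrative demo: recover a sparse vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[2], x_true[7] = 1.5, -2.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, b, lam=0.5)
```

The recovered vector is sparse with (slightly shrunk) spikes at the planted positions; the l1 penalty introduces the usual shrinkage bias on the nonzero entries.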
This paper studies gradient-based schemes for image denoising and deblurring problems based on the discretized total variation (TV) minimization model with constraints. We derive a fast algorithm for the constrained TV-based image deblurring problem. To achieve this task, we combine an acceleration of the well-known dual approach to the denoising problem …
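The dual approach to the denoising problem mentioned above can be illustrated in one dimension: gradient projection on the box-constrained dual of the TV denoising model, with the primal solution recovered from the dual variable. This 1D sketch is a simplification for illustration only (the paper treats 2D images, and the step size 1/4 and all names here are assumptions).

```python
import numpy as np

def tv_denoise_1d(b, lam, n_iter=2000):
    """Gradient projection on the dual of the 1D TV denoising problem
    min_x 0.5*||x - b||^2 + lam * sum_i |x[i+1] - x[i]|.
    The dual variable p is box-constrained to [-lam, lam]; the primal
    solution is recovered as x = b - D^T p, with D the forward difference."""
    p = np.zeros(len(b) - 1)
    for _ in range(n_iter):
        x = b.copy()
        x[:-1] += p                      # x = b - D^T p
        x[1:] -= p
        p = np.clip(p + 0.25 * np.diff(x), -lam, lam)  # step 1/L, L = ||D D^T|| <= 4
    x = b.copy()
    x[:-1] += p
    x[1:] -= p
    return x

# Illustrative demo: denoise a noisy step signal.
rng = np.random.default_rng(2)
b = np.concatenate([np.zeros(10), np.ones(10)]) + 0.05 * rng.standard_normal(20)
x = tv_denoise_1d(b, lam=0.5)
```

The output is essentially two flat levels, pulled toward each other by lam/10 relative to the segment means, which is the exact TV solution for a single-jump signal.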
We introduce a proximal alternating linearized minimization (PALM) algorithm for solving a broad class of nonconvex and nonsmooth minimization problems. Building on the powerful Kurdyka-Łojasiewicz property, we derive a self-contained convergence analysis framework and establish that each bounded sequence generated by PALM globally converges to a critical point …
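PALM alternates, over the blocks of variables, a gradient step on the smooth coupling term followed by the proximal map of the block's nonsmooth part, with step sizes driven by blockwise Lipschitz constants. A sketch on one standard nonconvex instance, nonnegative matrix factorization, where the proximal map reduces to projection onto the nonnegative orthant; this instance is chosen here only for illustration, as the paper addresses a much broader class.

```python
import numpy as np

def palm_nmf(M, r, n_iter=500, seed=3):
    """PALM on the nonconvex problem min_{X>=0, Y>=0} ||M - X Y||_F^2:
    alternate, per block, a gradient step on the smooth coupling term
    followed by projection onto the nonnegative orthant, with 1/c step
    sizes from the blockwise Lipschitz constants of the partial gradients."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(n_iter):
        cx = 2.0 * np.linalg.norm(Y @ Y.T, 2) + 1e-8   # Lipschitz const. in X
        X = np.maximum(X - 2.0 * (X @ Y - M) @ Y.T / cx, 0.0)
        cy = 2.0 * np.linalg.norm(X.T @ X, 2) + 1e-8   # Lipschitz const. in Y
        Y = np.maximum(Y - 2.0 * X.T @ (X @ Y - M) / cy, 0.0)
    return X, Y

# Illustrative demo: factor an exactly rank-2 nonnegative matrix.
rng = np.random.default_rng(3)
M = rng.random((6, 2)) @ rng.random((2, 8))
X, Y = palm_nmf(M, r=2)
```

Since the problem is nonconvex, the iterates converge to a critical point rather than a guaranteed global minimum, but on exactly low-rank nonnegative data the residual typically becomes small.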
The mirror descent algorithm (MDA) was introduced by Nemirovsky and Yudin for solving convex optimization problems. This method exhibits an efficiency estimate that is only mildly dependent on the dimension of the decision variables, and is thus suitable for solving very large-scale optimization problems. We present a new derivation and analysis of this algorithm. We show …
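One classical instantiation of the MDA uses the negative-entropy mirror map on the probability simplex, which turns each iteration into a multiplicative update. A minimal sketch under that assumption (the abstract does not fix a particular mirror map, and the names and step rule below are illustrative):

```python
import numpy as np

def entropic_mirror_descent(grad, x0, step, n_iter):
    """Mirror descent with the negative-entropy mirror map on the
    probability simplex: each iteration is the multiplicative update
    x <- x * exp(-step * grad(x)), renormalized to sum to one."""
    x = x0.copy()
    for _ in range(n_iter):
        w = x * np.exp(-step * grad(x))
        x = w / w.sum()
    return x

# Illustrative demo: minimize <c, x> over the simplex; the minimum
# puts all mass on the smallest coordinate of c.
c = np.array([3.0, 1.0, 2.0])
x = entropic_mirror_descent(lambda x: c, np.ones(3) / 3, step=0.5, n_iter=100)
```

The dimension enters the entropic-setup efficiency estimate only through a log factor, which is the "mildly dependent on the dimension" property the abstract refers to.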
This chapter presents in a self-contained manner recent advances in the design and analysis of gradient-based schemes for specially structured smooth and nonsmooth minimization problems. We focus on the mathematical elements and ideas for building fast gradient-based methods and derive their complexity bounds. Throughout the chapter, the resulting schemes …
We propose a unifying framework that combines smoothing approximation with fast first-order algorithms for solving nonsmooth convex minimization problems. We prove that, independently of the structure of the convex nonsmooth function involved and of the given fast first-order iterative scheme, it is always possible to improve the complexity rate and reach …
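One common way to combine smoothing with a fast first-order method, in the spirit of the framework described above, is to replace the nonsmooth term by its Moreau (Huber) smoothing and run an accelerated gradient method on the smoothed objective. The sketch below applies this to min ||Ax - b||_1; the smoothing parameter, problem data, and all names are illustrative assumptions, not the paper's specific framework.

```python
import numpy as np

def huber_grad(r, mu):
    """Gradient of the Huber (Moreau) smoothing of |.| with parameter mu."""
    return np.clip(r / mu, -1.0, 1.0)

def smoothed_accel_gradient(A, b, mu=0.1, n_iter=1000):
    """Accelerated gradient method on the mu-smoothed objective
    f_mu(x) = sum_i huber_mu((Ax - b)_i), approximating min_x ||Ax - b||_1.
    The smoothed gradient is (1/mu)-Lipschitz in the residual, giving
    L = ||A||_2^2 / mu for the composite map."""
    L = np.linalg.norm(A, 2) ** 2 / mu
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = y - (A.T @ huber_grad(A @ y - b, mu)) / L
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Illustrative demo: consistent system, so the true minimizer has zero residual.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x_hat = smoothed_accel_gradient(A, b)
```

Smaller mu means a tighter approximation of the l1 objective but a larger Lipschitz constant, hence more iterations; balancing the two is exactly the trade-off that smoothing-plus-acceleration frameworks optimize.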
The optimized certainty equivalent (OCE), first introduced by the authors in 1986, is a decision-theoretic criterion based on a utility function. This paper re-examines this fundamental concept, studies and extends its main properties, and puts it in perspective relative to recent concepts of risk measures. We show that the negative of the OCE naturally …
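The OCE of a random outcome X under utility u is S_u(X) = sup_eta { eta + E[u(X - eta)] }. The sketch below evaluates it numerically for a discrete distribution and checks it against the known closed form for the normalized exponential utility u(t) = 1 - exp(-t), for which the OCE equals -log E[exp(-X)] (minus the entropic risk measure); the grid-search implementation and variable names are illustrative.

```python
import numpy as np

def oce(values, probs, u, etas):
    """Optimized certainty equivalent S_u(X) = sup_eta { eta + E[u(X - eta)] },
    maximized here by brute force over a grid of candidate eta values."""
    best = -np.inf
    for eta in etas:
        best = max(best, eta + float(np.sum(probs * u(values - eta))))
    return best

# Discrete outcome X and the normalized exponential utility u(t) = 1 - exp(-t)
# (u(0) = 0, u'(0) = 1). For this u, the optimal eta satisfies
# exp(eta) = 1 / E[exp(-X)], giving S_u(X) = -log E[exp(-X)].
vals = np.array([-1.0, 0.0, 2.0])
probs = np.array([0.2, 0.5, 0.3])
u = lambda t: 1.0 - np.exp(-t)
numeric = oce(vals, probs, u, np.linspace(-3.0, 3.0, 6001))
closed = -np.log(np.sum(probs * np.exp(-vals)))
```

The grid maximum matches the closed form to within the grid resolution, illustrating how the negative of the OCE recovers a familiar convex risk measure.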