We show how certain nonconvex optimization problems that arise in image processing and computer vision can be restated as convex minimization problems. This allows, in particular, finding global minimizers via standard convex minimization schemes. Image denoising and segmentation are two related, fundamental problems of computer vision…
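A representative instance of this strategy, offered as our own illustration rather than a statement of the paper's results: a two-phase segmentation can be posed as minimizing a total-variation term plus a region term over binary labelings u(x) ∈ {0, 1}, which is nonconvex; relaxing the constraint to u(x) ∈ [0, 1] gives the convex problem min_{0 ≤ u ≤ 1} TV(u) + λ ∫ r(x) u(x) dx, and thresholding a minimizer of the relaxed problem can recover a binary solution of the original one.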
We address the minimization of regularized convex cost-functions that are customarily used for edge-preserving restoration and reconstruction of signals and images. In order to accelerate computation, the multiplicative and the additive half-quadratic reformulations of the original cost-function were pioneered in Geman & Reynolds (1992) and Geman & Yang (1995)…
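To make the additive half-quadratic idea concrete, here is a minimal NumPy sketch for 1D denoising. It assumes the edge-preserving potential φ(t) = √(t² + ε²) and parameter choices of our own; it illustrates the computational structure of the reformulation, not the exact schemes analyzed in the paper.

import numpy as np

def hq_additive_denoise(y, alpha=0.5, eps=1e-2, n_iter=100):
    """Additive half-quadratic (Geman & Yang-type) minimization of
    J(x) = ||x - y||^2 + alpha * sum_i phi(x[i+1] - x[i]),
    with phi(t) = sqrt(t^2 + eps^2).  Illustrative sketch only."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # first-order difference operator
    c = eps                               # requires 1/c >= sup phi'' = 1/eps
    mu = alpha / (2 * c)
    # The system matrix is fixed across iterations: the point of the
    # additive reformulation is that it can be inverted/factorized once.
    A_inv = np.linalg.inv(np.eye(n) + mu * D.T @ D)
    x = y.copy()
    for _ in range(n_iter):
        t = D @ x
        b = t - c * t / np.sqrt(t**2 + eps**2)   # auxiliary step: b = t - c*phi'(t)
        x = A_inv @ (y + mu * (D.T @ b))         # exact minimization over x
    return x

The auxiliary update is closed-form and the linear step reuses one prefactored matrix, which is where the acceleration over direct nonlinear minimization comes from.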
This paper proposes a two-phase scheme for removing salt-and-pepper impulse noise. In the first phase, an adaptive median filter is used to identify pixels which are likely to be contaminated by noise (noise candidates). In the second phase, the image is restored using a specialized regularization method that applies only to those selected noise candidates…
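A toy sketch of such a two-phase pipeline, with a simplified detector and a crude neighborhood-median smoother standing in for the paper's regularization phase; the function names and parameters below are our illustrative assumptions.

import numpy as np

def adaptive_median_detect(img, w_max=7):
    """Phase 1 (simplified): flag likely salt-and-pepper pixels. A pixel is
    a noise candidate if it is an extreme value in the smallest window whose
    median is not itself extreme. Slow, loop-based toy version."""
    pad = w_max // 2
    padded = np.pad(img, pad, mode='reflect')
    noisy = np.zeros(img.shape, dtype=bool)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            for w in range(3, w_max + 1, 2):
                r = w // 2
                win = padded[i + pad - r:i + pad + r + 1,
                             j + pad - r:j + pad + r + 1]
                lo, med, hi = win.min(), np.median(win), win.max()
                if lo < med < hi:                  # window is informative
                    noisy[i, j] = not (lo < img[i, j] < hi)
                    break
            else:
                noisy[i, j] = True                 # no informative window found
    return noisy

def restore_candidates(img, mask, n_iter=50):
    """Phase 2 (crude stand-in for the regularization method): repeatedly
    replace each noise candidate by the median of its 4 neighbours while
    keeping the uncorrupted pixels fixed. np.roll wraps at borders, which
    is acceptable only in this toy sketch."""
    u = img.astype(float)
    for _ in range(n_iter):
        shifted = [np.roll(u, s, axis=a) for a in (0, 1) for s in (-1, 1)]
        u[mask] = np.median(np.stack(shifted), axis=0)[mask]
    return u

# Usage: restored = restore_candidates(noisy, adaptive_median_detect(noisy))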
We present a theoretical study of the recovery of an unknown vector x ∈ ℝ^p (a signal, an image) from noisy data y ∈ ℝ^q by minimizing with respect to x a regularized cost-function F(x, y) = Ψ(x, y) + αΦ(x), where Ψ is a data-fidelity term, Φ is a smooth regularization term, and α > 0 is a parameter. Typically, Ψ(x, y) = ‖Ax − y‖², where A is a linear…
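As a baseline illustration of minimizing such an F, in contrast to the half-quadratic acceleration sketched above, here is plain gradient descent with a smooth regularizer of our own choosing; Φ, the step-size rule, and the function name are assumptions, and only the form F = Ψ + αΦ comes from the abstract.

import numpy as np

def minimize_F(A, y, alpha=0.1, eps=1e-2, n_iter=5000):
    """Gradient descent on F(x, y) = ||Ax - y||^2 + alpha * Phi(x), with
    Phi(x) = sum_i sqrt((x[i+1] - x[i])^2 + eps^2) as a smooth regularizer."""
    n = A.shape[1]
    D = np.diff(np.eye(n), axis=0)               # finite differences
    # Safe step 1/L: phi'' <= 1/eps and ||D^T D|| <= 4 in 1D, so the
    # gradient of F is Lipschitz with L <= 2*||A||^2 + 4*alpha/eps.
    step = 1.0 / (2 * np.linalg.norm(A, 2) ** 2 + 4 * alpha / eps)
    x = np.zeros(n)
    for _ in range(n_iter):
        t = D @ x
        grad = 2 * A.T @ (A @ x - y) + alpha * D.T @ (t / np.sqrt(t**2 + eps**2))
        x -= step * grad
    return x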
We consider the restoration of piecewise constant images, where the number of regions and their values are not fixed in advance and neighboring regions take clearly contrasted constant values, from noisy data observed at the output of a linear operator (e.g., a blurring kernel or a Radon transform). We thus also address the generic…
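A typical cost-function for this setting, written in the notation of the preceding abstracts as our sketch of the setup rather than the paper's exact formulation, is F(x) = ‖Ax − y‖² + α Σ_{i∼j} φ(x_i − x_j), where A is the linear operator, the sum runs over pairs of neighboring pixels, and φ is nonsmooth at zero (e.g., φ(t) = |t|). The nonsmoothness at zero is what makes minimizers exactly constant over regions, while the assumed contrast between neighboring regions helps them remain distinct.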
We focus on the question of how the shape of a cost-function determines the features manifested by its local (and hence global) minimizers. Our goal is to examine the possibility that the local minimizers of an unconstrained cost-function satisfy data-dependent subsets of affine constraints, hence the word "weak". A typical example is the…
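To make the "weak constraints" phenomenon concrete with an illustrative instance of our own choosing: for F(x) = ‖x − y‖² + α Σ_i |x_{i+1} − x_i|, the absolute value is nonsmooth at zero, so a local minimizer x̂ typically satisfies the affine constraints x̂_{i+1} = x̂_i exactly, for a subset of indices i that depends on the data y. The constraints are "weak" in the sense that none of them is imposed beforehand, yet some data-dependent subset of them is active at the minimizer.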
We have an arbitrary real-valued M × N matrix A (e.g., a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function F_d combining a quadratic data-fidelity term and an ℓ0 penalty applied to each entry of the sought-after…
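Since the paper analyzes the minimizers themselves rather than prescribing a solver, here is one standard way to search for local minimizers of such an objective, iterative hard thresholding; the solver choice and every parameter below are our illustrative assumptions.

import numpy as np

def iht(A, d, beta=0.1, n_iter=500):
    """Iterative hard thresholding on F_d(x) = ||Ax - d||^2 + beta * ||x||_0:
    a gradient step on the quadratic term followed by the exact proximal map
    of the l0 penalty (hard threshold). Reaches local minimizers only."""
    step = 1.0 / (2 * np.linalg.norm(A, 2) ** 2)   # 1 / Lipschitz const. of grad
    thresh = np.sqrt(2 * step * beta)              # l0 prox threshold
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * 2 * A.T @ (A @ x - d)       # gradient step
        x = np.where(np.abs(z) > thresh, z, 0.0)   # hard thresholding
    return x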