Consider classes of signals that have a finite number of degrees of freedom per unit of time and call this number the rate of innovation. Examples of signals with a finite rate of innovation include streams of Diracs (e.g., the Poisson process), nonuniform splines, and piecewise polynomials. Even though these signals are not bandlimited, we show that they…
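As a concrete instance of the signal model above (a sketch in generic notation, not reproduced from the paper), a stream of $K$ Diracs on an interval of length $\tau$,

$$x(t) = \sum_{k=1}^{K} a_k\,\delta(t - t_k), \qquad t \in [0, \tau),$$

is completely determined by the $K$ amplitudes $a_k$ and the $K$ locations $t_k$, so its rate of innovation is $\rho = 2K/\tau$ even though its spectrum is not bandlimited.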

This paper introduces a new approach to orthonormal wavelet image denoising. Instead of postulating a statistical model for the wavelet coefficients, we directly parametrize the denoising process as a sum of elementary nonlinear processes with unknown weights. We then minimize an estimate of the mean square error between the clean image and the denoised…
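In outline (a sketch of the idea with generic notation, not the paper's exact derivation), the denoised estimate is written as a linear expansion of thresholds (LET),

$$\hat{x} = \sum_{k=1}^{K} a_k\,\theta_k(y),$$

where the $\theta_k$ are fixed elementary nonlinear processes applied to the noisy wavelet coefficients $y$ and only the weights $a_k$ are unknown. Because the risk estimate is quadratic in the $a_k$, its minimization reduces to solving the small linear system

$$\sum_{l} \big(\theta_k(y)^{\mathsf T}\theta_l(y)\big)\,a_l \;=\; \theta_k(y)^{\mathsf T}y \;-\; \sigma^2\,\operatorname{div}\theta_k(y), \qquad k = 1,\dots,K.$$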

We propose a new approach to image denoising, based on the image-domain minimization of an estimate of the mean squared error, Stein's unbiased risk estimate (SURE). Unlike most existing denoising algorithms, using the SURE removes the need to hypothesize a statistical model for the noiseless image. A key point of our approach is that, although the…
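For reference, if the noisy image is $y = x + n$ with $n \sim \mathcal{N}(0, \sigma^2 I)$ and $f(y)$ denotes the denoised output, Stein's result gives (standard form, generic notation)

$$\mathrm{SURE} = \frac{1}{N}\Big(\|f(y) - y\|^2 - N\sigma^2 + 2\sigma^2 \operatorname{div} f(y)\Big),$$

an estimate whose expectation equals that of the true mean squared error $\tfrac{1}{N}\|f(y) - x\|^2$ while involving only the measured data $y$.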

Consider the problem of sampling signals which are not bandlimited but still have a finite number of degrees of freedom per unit of time, such as nonuniform splines or piecewise polynomials, and call the number of degrees of freedom per unit of time the rate of innovation. Classical sampling theory does not enable a perfect reconstruction of…

We consider the problem of optimizing the parameters of a given denoising algorithm for restoration of a signal corrupted by white Gaussian noise. To achieve this, we propose to minimize Stein's unbiased risk estimate (SURE), which provides a means of assessing the true mean-squared error (MSE) purely from the measured data, without need for any knowledge…
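A minimal sketch of how such a risk estimate can drive parameter selection for a black-box denoiser, assuming the divergence term is probed numerically with a random perturbation; the names `mc_sure` and `denoise`, the toy soft-threshold denoiser, and the step size `eps` are illustrative choices, not code from the paper.

```python
import numpy as np

def mc_sure(y, denoise, sigma, eps=1e-3, rng=None):
    """Estimate SURE for a black-box denoiser f = denoise(y).

    The divergence term is approximated with a random probe b:
    div f(y) ~ b . (denoise(y + eps*b) - denoise(y)) / eps.
    """
    rng = np.random.default_rng() if rng is None else rng
    f = denoise(y)
    b = rng.standard_normal(y.shape)
    div = b.ravel() @ (denoise(y + eps * b) - f).ravel() / eps
    n = y.size
    return (np.sum((f - y) ** 2) - n * sigma**2 + 2 * sigma**2 * div) / n

# Example: pick the soft-threshold level minimizing the estimated risk
# (a toy 1-D denoiser used only to exercise the machinery).
sigma = 0.1
rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 8 * np.pi, 1024)) + sigma * rng.standard_normal(1024)
soft = lambda t: (lambda z: np.sign(z) * np.maximum(np.abs(z) - t, 0.0))
best_t = min(np.linspace(0.0, 0.5, 26),
             key=lambda t: mc_sure(y, soft(t), sigma, rng=rng))
```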

- Thierry Blu, Pina Marziliano, Lionel Coulot
- 2008

Signal acquisition and reconstruction is at the heart of signal processing, and sampling theorems provide the bridge between the continuous and the discrete time worlds. The most celebrated and widely used sampling theorem is often attributed to Shannon (and many others, from Whittaker to Kotel'nikov and Nyquist, to name a few) and gives a sufficient…
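A compact numerical sketch of the annihilating-filter (Prony-type) recovery step that this line of work builds on; the function `fri_recover` and its interface are illustrative assumptions, not code from the article.

```python
import numpy as np

def fri_recover(X, K, tau):
    """Recover K Diracs from M >= 2K + 1 consecutive Fourier coefficients
    X[m] = sum_k a_k * exp(-2j*pi*m*t_k/tau) (annihilating-filter sketch)."""
    M = len(X)
    # Annihilation: sum_l h[l] * X[m - l] = 0 for m = K..M-1, so the filter h
    # spans the nullspace of a Toeplitz matrix built from the coefficients.
    A = np.array([[X[m - l] for l in range(K + 1)] for m in range(K, M)])
    h = np.linalg.svd(A)[2][-1].conj()
    # Roots of h are u_k = exp(-2j*pi*t_k/tau); their phases give the locations.
    t = np.sort(np.mod(-np.angle(np.roots(h)) * tau / (2 * np.pi), tau))
    # Amplitudes follow from a Vandermonde least-squares fit.
    V = np.exp(-2j * np.pi * np.outer(np.arange(M), t) / tau)
    a = np.linalg.lstsq(V, X, rcond=None)[0].real
    return t, a

# Toy check: two Diracs recovered from nine Fourier coefficients.
tau, t_true, a_true = 1.0, np.array([0.21, 0.64]), np.array([1.5, -0.7])
X = np.exp(-2j * np.pi * np.outer(np.arange(9), t_true) / tau) @ a_true
t_hat, a_hat = fri_recover(X, K=2, tau=tau)
```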

Based on the theory of approximation, this paper presents a unified analysis of interpolation and resampling techniques. An important issue is the choice of adequate basis functions. We show that, contrary to common belief, those that perform best are not interpolating. In contrast to traditional interpolation, we call their use generalized…
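A brief illustration of the resulting two-step procedure (digital prefilter followed by B-spline synthesis), sketched here with SciPy's spline routines; the signal and parameters are arbitrary examples, not taken from the paper.

```python
import numpy as np
from scipy import ndimage

# Cubic B-splines are non-interpolating, so the samples are first turned into
# B-spline coefficients by a recursive digital prefilter; the continuous model
# sum_k c[k] * beta3(x - k) then passes exactly through the original samples.
samples = np.cos(2 * np.pi * np.arange(32) / 32.0)
coeffs = ndimage.spline_filter1d(samples, order=3)          # prefilter step
x = np.linspace(0.0, 31.0, 200)
values = ndimage.map_coordinates(coeffs, [x], order=3, mode='mirror',
                                 prefilter=False)           # already prefiltered
```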

We consider the problem of interpolating a signal using a linear combination of shifted versions of a compactly supported basis function φ(x). We first give the expression of the φ's that have minimal support for a given accuracy (also known as the "approximation order"). This class of functions, which we call maximal-order-minimal-support…

We propose a general methodology (PURE-LET) to design and optimize a wide class of transform-domain thresholding algorithms for denoising images corrupted by mixed Poisson-Gaussian noise. We express the denoising process as a linear expansion of thresholds (LET) that we optimize by relying on a purely data-adaptive unbiased estimate of the mean-squared…

This chapter presents a survey of interpolation and resampling techniques in the context of exact, separable interpolation of regularly sampled data. In this context, the traditional view of interpolation is to represent an arbitrary continuous function as a discrete sum of weighted and shifted synthesis functions; in other words, a mixed convolution…
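The mixed convolution referred to above takes the familiar form (standard notation, not quoted from the chapter)

$$f(x) = \sum_{k \in \mathbb{Z}} f_k\,\varphi_{\mathrm{int}}(x - k),$$

where the $f_k$ are the regularly spaced samples and the synthesis function $\varphi_{\mathrm{int}}$ satisfies the interpolation condition $\varphi_{\mathrm{int}}(k) = \delta_{k,0}$, so that the reconstructed $f$ passes exactly through the data.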