
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a…

With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable knot spline, or variable bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive…

We attempt to recover a function of unknown smoothness from noisy, sampled data. We introduce a procedure, SureShrink, which suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: a threshold level is assigned to each dyadic resolution level by the principle of minimizing the Stein Unbiased Estimate of Risk…
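A minimal sketch of the idea, not the SureShrink implementation itself: soft thresholding of a coefficient vector, with the threshold chosen by minimizing Stein's Unbiased Risk Estimate. The signal, sizes, and unit noise variance below are made-up illustration values, and the vector stands in for one dyadic resolution level of empirical wavelet coefficients.

```python
import numpy as np

def soft_threshold(x, t):
    """Shrink each coefficient toward zero by t (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sure_threshold(x):
    """Pick the threshold minimizing Stein's Unbiased Risk Estimate,
    assuming x ~ N(theta, I); the minimizer lies among {|x_i|}."""
    n = x.size
    candidates = np.sort(np.abs(x))
    risks = [n - 2 * np.count_nonzero(np.abs(x) <= t)
             + np.sum(np.minimum(x ** 2, t ** 2)) for t in candidates]
    return candidates[int(np.argmin(risks))]

# Hypothetical example: sparse coefficients plus unit-variance Gaussian noise.
rng = np.random.default_rng(0)
theta = np.zeros(256)
theta[:16] = 5.0
x = theta + rng.standard_normal(256)
t = sure_threshold(x)
denoised = soft_threshold(x, t)
```

Because most coefficients are pure noise, the data-chosen threshold kills them while the few large coefficients survive (shrunk by t), which is the sense in which the procedure adapts to unknown smoothness.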

Considerable effort has been directed recently to develop asymptotically minimax methods in problems of recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A rich and complex body of work has evolved, with nearly- or exactly-minimax estimators being obtained for a variety of interesting problems.…

We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel- and…

We attempt to recover an n-dimensional vector observed in white noise, where n is large and the vector is known to be sparse, but the degree of sparsity is unknown. We consider three different ways of defining sparsity of a vector: using the fraction of nonzero terms; imposing power-law decay bounds on the ordered entries; and controlling the ℓp norm for p…
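As a hedged illustration of the three sparsity measures named above (the vector, the exponent p, and the decay bound are made-up example values, not the paper's full definitions), each can be computed directly:

```python
import numpy as np

x = np.array([4.0, -2.0, 1.0, 0.5, 0.0, 0.0, 0.0, 0.0])
p = 0.5  # an example exponent in (0, 1)

# (1) Fraction of nonzero entries.
frac_nonzero = np.count_nonzero(x) / x.size

# (2) Power-law decay bound on the ordered magnitudes:
# the smallest C with |x|_(k) <= C * k**(-1/p) for all k.
ordered = np.sort(np.abs(x))[::-1]
k = np.arange(1, x.size + 1)
C = np.max(ordered * k ** (1.0 / p))

# (3) The l_p quasi-norm for p < 1, which is small when x is sparse.
lp = np.sum(np.abs(x) ** p) ** (1.0 / p)
```

Each measure penalizes "spread-out" vectors differently: the nonzero fraction ignores magnitudes entirely, while the decay bound and the ℓp quasi-norm grade how quickly the sorted entries fall off.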

Density estimation is a commonly used test case for nonparametric estimation methods. We explore the asymptotic properties of estimators based on thresholding of empirical wavelet coefficients. Minimax rates of convergence are studied over a large range of Besov function classes B_{σ,p,q} and for a range of global L_{p′} error measures, 1 ≤ p′ < ∞. A single…

Consider estimating the mean vector θ from data N_n(θ, σ²I) with ℓq norm loss, q ≥ 1, when θ is known to lie in an n-dimensional ℓp ball, p ∈ (0, ∞). For large n, the ratio of minimax linear risk to minimax risk can be arbitrarily large if p < q. Obvious exceptions aside, the limiting ratio equals 1 only if p = q = 2. Our arguments are mostly indirect,…

WaveLab is a library of Matlab routines for wavelet analysis, wavelet-packet analysis, cosine-packet analysis and matching pursuit. The library is available free of charge over the Internet. Versions are provided for Macintosh, UNIX and Windows machines. WaveLab makes available, in one package, all the code to reproduce all the figures in our published…

- David L. Donoho, Iain Johnstone, Andrea Montanari
- IEEE Transactions on Information Theory
- 2013

Compressed sensing posits that, within limits, one can undersample a sparse signal and yet reconstruct it accurately. Knowing the precise limits to such undersampling is important both for theory and practice. We present a formula that characterizes the allowed undersampling of generalized sparse objects. The formula applies to approximate message passing…
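The paper's formula concerns approximate message passing; as a simpler, hypothetical illustration of undersampled sparse recovery (plain iterative soft thresholding for the lasso rather than AMP, with made-up problem sizes and a noiseless measurement model), one can sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 80, 200, 5                        # measurements < dimension; sparsity k
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
support = rng.choice(d, size=k, replace=False)
x_true[support] = 3.0 * rng.standard_normal(k)
y = A @ x_true                              # noiseless undersampled observations

def soft(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Iterative soft thresholding (ISTA) for the lasso objective
# 0.5 * ||y - A x||^2 + lam * ||x||_1.
L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
lam = 0.01
x_hat = np.zeros(d)
for _ in range(3000):
    x_hat = soft(x_hat + A.T @ (y - A @ x_hat) / L, lam / L)
```

With n = 80 measurements of a 5-sparse 200-dimensional vector, this sits well inside the recoverable regime, so the iteration converges to a close approximation of x_true; the paper's contribution is the precise formula for where that regime ends.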