Corpus ID: 214794994

Relaxing the Gaussian assumption in Shrinkage and SURE in high dimension.

@article{Fathi2020RelaxingTG,
  title={Relaxing the Gaussian assumption in Shrinkage and SURE in high dimension.},
  author={Max Fathi and Larry Goldstein and Gesine Reinert and Adrien Saumard},
  journal={arXiv: Statistics Theory},
  year={2020}
}
Shrinkage estimation is a fundamental tool of modern statistics, pioneered by Charles Stein upon the discovery of his famous paradox. Despite a large subsequent literature, the efficiency of shrinkage, and the associated procedure known as Stein's Unbiased Risk Estimate, or SURE, has mainly been analysed in the Gaussian setting. Importing tools developed for use in the probabilistic area now known as Stein's method, the present work investigates the domain of validity of shrinkage and SURE away… 
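To make the two objects in the abstract concrete, the following is a minimal sketch (Python/NumPy; not taken from the paper) of James-Stein shrinkage in the Gaussian sequence model X ~ N(θ, σ²I_d) together with its closed-form SURE, dσ² − (d−2)²σ⁴/‖X‖², checked against the Monte Carlo risk. The dimension, noise level, and true mean vector below are illustrative choices.

```python
# Minimal sketch: SURE for James-Stein shrinkage in the Gaussian sequence model
# X ~ N(theta, sigma^2 I_d).  For an estimator of the form X + g(X), the quantity
# SURE(X) = d*sigma^2 + ||g(X)||^2 + 2*sigma^2 * div g(X) is an unbiased estimate
# of the risk E||X + g(X) - theta||^2.
import numpy as np

def james_stein(x, sigma2=1.0):
    """James-Stein estimator (1 - (d-2)*sigma^2/||x||^2) * x."""
    d = x.size
    return (1.0 - (d - 2) * sigma2 / np.sum(x**2)) * x

def sure_james_stein(x, sigma2=1.0):
    """Closed-form SURE for James-Stein: d*sigma^2 - (d-2)^2*sigma^4/||x||^2."""
    d = x.size
    return d * sigma2 - (d - 2) ** 2 * sigma2**2 / np.sum(x**2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, sigma2 = 50, 1.0
    theta = rng.normal(scale=0.5, size=d)   # illustrative true mean vector

    # Monte Carlo check: averaged SURE should match the Monte Carlo risk.
    losses, sures = [], []
    for _ in range(5000):
        x = theta + rng.normal(scale=np.sqrt(sigma2), size=d)
        losses.append(np.sum((james_stein(x, sigma2) - theta) ** 2))
        sures.append(sure_james_stein(x, sigma2))
    print("MC risk of James-Stein :", np.mean(losses))
    print("mean SURE              :", np.mean(sures))
    print("risk of raw X (= d*s2) :", d * sigma2)
```

Averaged over replicates, SURE agrees with the Monte Carlo risk, which for d ≥ 3 falls below the risk dσ² of the unshrunk estimator X.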
Stein’s Method Meets Statistics: A Review of Some Recent Developments
Stein’s method is a collection of tools for analysing distributional comparisons through the study of a class of linear operators called Stein operators. Originally studied in probability, Stein’s
High-Dimensional Multi-Task Averaging and Application to Kernel Mean Embedding
TLDR
An improved estimator is proposed for the multi-task averaging problem, whose goal is the joint estimation of the means of multiple distributions using separate, independent data sets, and it is proved theoretically that this approach provides a reduction in mean squared error.
Stein’s method of normal approximation: Some recollections and reflections
This paper is a short exposition of Stein’s method of normal approximation from my personal perspective. It focuses mainly on the characterization of the normal distribution and the construction of
Zero Bias Enhanced Stein Couplings
The Stein couplings of Chen and Roellin [6] vastly expanded the range of applications for which coupling constructions in Stein’s method for normal approximation could be applied, and subsumed both
Stein's Method Meets Computational Statistics: A Review of Some Recent Developments
Stein’s method compares probability distributions through the study of a class of linear operators called Stein operators. While mainly studied in probability and used to underpin theoretical

References

SHOWING 1-10 OF 65 REFERENCES
Generalized SURE for Exponential Families: Applications to Regularization
TLDR
A regularized SURE objective is proposed, its use in the context of wavelet denoising is demonstrated, and a new method for choosing regularization parameters in penalized LS estimators is suggested.
Generalized SURE for optimal shrinkage of singular values in low-rank matrix denoising
TLDR
This work derives generalized Stein's unbiased risk estimation formulas that hold for any spectral estimator which shrinks or thresholds the singular values of the data matrix, leading to new data-driven spectral estimators whose optimality is discussed using tools from random matrix theory and through numerical experiments.
Adapting to Unknown Smoothness via Wavelet Shrinkage
We attempt to recover a function of unknown smoothness from noisy sampled data. We introduce a procedure, SureShrink, that suppresses noise by thresholding the empirical wavelet
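To illustrate the SureShrink idea just described, here is a minimal sketch (Python/NumPy; assuming unit-variance Gaussian noise on the coefficients and a finite grid of candidate thresholds, not the exact procedure of the paper) that soft-thresholds noisy coefficients at the level minimizing SURE(t; x) = dσ² + Σ min(|x_i|, t)² − 2σ²·#{i : |x_i| ≤ t}.

```python
# Minimal sketch of SureShrink-style denoising: soft-threshold noisy coefficients
# at the threshold that minimizes SURE over a grid of candidate levels.
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding rule: sign(x) * (|x| - t)_+."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sure_soft(x, t, sigma2=1.0):
    """SURE for soft thresholding at level t under N(theta, sigma^2 I) noise."""
    d = x.size
    return d * sigma2 + np.sum(np.minimum(np.abs(x), t) ** 2) \
        - 2 * sigma2 * np.sum(np.abs(x) <= t)

def sure_shrink(x, sigma2=1.0):
    """Pick the threshold minimizing SURE over the candidates 0, |x_1|, ..., |x_d|."""
    candidates = np.concatenate(([0.0], np.abs(x)))
    t_best = min(candidates, key=lambda t: sure_soft(x, t, sigma2))
    return soft_threshold(x, t_best), t_best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    theta = np.zeros(200)
    theta[:10] = 5.0                          # sparse illustrative signal
    x = theta + rng.normal(size=200)          # noisy observations, sigma^2 = 1
    est, t = sure_shrink(x)
    print("chosen threshold:", t)
    print("MSE raw         :", np.mean((x - theta) ** 2))
    print("MSE SureShrink  :", np.mean((est - theta) ** 2))
```

On a sparse signal, the SURE-chosen threshold suppresses most of the pure-noise coordinates while keeping the large coefficients, which is the source of the MSE improvement over the raw observations.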
Estimation with Quadratic Loss
It has long been customary to measure the adequacy of an estimator by the smallness of its mean squared error. The least squares estimators were studied by Gauss and by other authors later in the
Estimating LASSO Risk and Noise Level
TLDR
To the best of the authors' knowledge, this result is the first that provides an asymptotically consistent risk estimator for the LASSO solely based on data, and it is demonstrated through simulations that the variance estimation outperforms several existing methods in the literature.
The LASSO Risk for Gaussian Matrices
TLDR
This result is the first rigorous derivation of an explicit formula for the asymptotic mean square error of the LASSO for random instances and is based on the analysis of approximate message passing (AMP), a recently developed efficient algorithm inspired by graphical model ideas.
Unbiased Risk Estimates for Singular Value Thresholding and Spectral Estimators
TLDR
An unbiased risk estimate formula is given for singular value thresholding (SVT), a popular estimation strategy that applies a soft-thresholding rule to the singular values of the noisy observations, and the utility of this unbiased risk estimate is demonstrated for SVT-based denoising of real clinical cardiac MRI series data.
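For concreteness, the following is a minimal sketch (Python/NumPy) of the SVT operation itself, i.e. soft-thresholding the singular values of a noisy data matrix. The threshold σ(√m + √n), roughly the operator norm of the noise, is a common heuristic used here only for illustration and is not the reference's SURE-based choice; the dimensions and rank are illustrative.

```python
# Minimal sketch of singular value thresholding (SVT): soft-threshold the
# singular values of a noisy matrix to recover a low-rank signal.
import numpy as np

def svt(y, tau):
    """Apply the soft-thresholding rule at level tau to the singular values of y."""
    u, s, vt = np.linalg.svd(y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return u @ np.diag(s_shrunk) @ vt

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    m, n, r, sigma = 60, 40, 3, 1.0
    # Low-rank signal plus Gaussian noise.
    signal = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))
    y = signal + sigma * rng.normal(size=(m, n))
    tau = sigma * (np.sqrt(m) + np.sqrt(n))   # heuristic threshold (illustrative)
    print("error raw :", np.linalg.norm(y - signal))
    print("error SVT :", np.linalg.norm(svt(y, tau) - signal))
```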
Adaptive denoising based on SURE risk
TLDR
The results indicate that the proposed method is very effective in adaptively finding the optimal solution in a mean squared error (MSE) sense, and it is shown that this method gives better MSE performance than conventional wavelet shrinkage methods.
INADMISSIBILITY OF THE USUAL ESTIMATOR FOR THE MEAN OF A MULTIVARIATE NORMAL DISTRIBUTION
If one observes the real random variables X_1, …, X_n, independently normally distributed with unknown means ξ_1, …, ξ_n and variance 1, it is customary to estimate ξ_i by X_i. If the loss is the sum of
…