We consider first the spline smoothing nonparametric estimation with variable smoothing parameter and arbitrary design density function and show that the corresponding equivalent kernel can be approximated by the Green's function of a certain linear differential operator. Furthermore, we propose to use the standard (in applied mathematics and engineering)…

We consider a problem of recovering a high-dimensional vector µ observed in white noise, where the unknown vector µ is assumed to be sparse. The objective of the paper is to develop a Bayesian formalism which gives rise to a family of l 0-type penalties. The penalties are associated with various choices of the prior distributions π n (·) on the number of…
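The coordinate-wise structure behind l0-type penalties can be sketched with the simplest constant-per-coefficient case (a minimal illustration only, not the paper's estimator, which derives variable penalties from priors on the number of nonzero components; the function name is hypothetical):

```python
def l0_penalized_estimate(y, lam):
    """Minimize sum_i (y_i - mu_i)^2 + lam * #{i : mu_i != 0}.

    The objective decouples coordinate-wise: keeping mu_i = y_i costs lam,
    while setting mu_i = 0 costs y_i^2, so the minimizer is hard thresholding.
    """
    return [y_i if y_i * y_i > lam else 0.0 for y_i in y]

# Coordinates whose squared observation exceeds the penalty survive:
# l0_penalized_estimate([3.0, 0.5, -2.0, 0.1], 1.0) -> [3.0, 0.0, -2.0, 0.0]
```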

- BY FELIX ABRAMOVICH, VADIM GRINSHTEIN, ATHANASIA PETSA
- 2009

SUMMARY We consider the problem of estimating the unknown response function in the Gaussian white noise model. We first utilize the recently developed Bayesian maximum a posteriori testimation procedure of Abramovich et al. (2007) for recovering an unknown high-dimensional Gaussian mean vector. The existing results for its upper error bounds over various…

We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors might be much larger than the number of observations. From a frequentist view, the proposed procedure results in the penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the…

We consider model selection in Gaussian regression, where the number of predictors might be even larger than the number of observations. The proposed procedure is based on penalized least square criteria with a complexity penalty on a model size. We discuss asymptotic properties of the resulting estimators corresponding to linear and so-called 2k…
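Penalized least squares with a complexity penalty on model size can be sketched as brute-force best-subset search (feasible only for small numbers of predictors; the abstracts above concern the theory of such criteria, not this exhaustive search, and the function names and choice of penalty below are hypothetical):

```python
import itertools
import numpy as np

def best_subset(X, y, pen):
    """Minimize RSS(M) + pen(|M|) over all subsets M of predictor columns.

    pen maps model size k to a complexity penalty, e.g. penalties that
    grow with k and the total number of predictors p.
    """
    n, p = X.shape
    best_score, best_model = float(np.sum(y ** 2)) + pen(0), ()  # empty model
    for k in range(1, p + 1):
        for M in itertools.combinations(range(p), k):
            cols = list(M)
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = float(np.sum((y - X[:, cols] @ beta) ** 2))
            if rss + pen(k) < best_score:
                best_score, best_model = rss + pen(k), M
    return best_model
```

With a noiseless response supported on two predictors, any model containing the true pair fits exactly, and the penalty then selects the smallest such model.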

We consider model selection in generalized linear models (GLM) for high-dimensional data and propose a wide class of model selection criteria based on penalized maximum likelihood with a complexity penalty on the model size. We derive a general nonasymptotic upper bound for the Kullback-Leibler risk of the resulting estimators and establish the…

- BY FELIX ABRAMOVICH, VADIM GRINSHTEIN
- 2013

SUMMARY We consider estimating a sparse group of sparse normal mean vectors, based on penalized likelihood estimation with complexity penalties on the number of nonzero mean vectors and the numbers of their significant components, which can be performed by a fast algorithm. The resulting estimators are developed within a Bayesian framework and can be viewed…

