The main idea is to introduce the fundamental concepts of the theory while keeping the exposition suitable for a first approach to the field; many important and useful results on optimal and adaptive estimation are also provided.

We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk…
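As a rough illustration of the sparsity scenario mentioned above, here is a minimal sketch of fitting the Lasso on data whose true coefficient vector has only a few nonzero components. It assumes scikit-learn; the design, noise level, and penalty value are illustrative choices, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sparse scenario: only the first 3 of 50 coefficients are nonzero.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [4.0, -3.0, 2.0]
y = X @ beta + 0.5 * rng.standard_normal(n)

# Lasso: least squares plus an l1 penalty, which zeroes out many coefficients.
lasso = Lasso(alpha=0.1).fit(X, y)
support = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)
print(support)  # the estimated support concentrates on the true one
```

With a reasonable penalty level, the estimated support contains the three true nonzero coordinates and few spurious ones, which is the qualitative behavior the oracle inequalities quantify.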

A new nuclear-norm-penalized estimator of $A_0$ is proposed, and a general sharp oracle inequality for this estimator is established for arbitrary values of $n, m_1, m_2$ under a condition of isometry in expectation, with the aim of finding the best trace regression model approximating the data.
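A core computational ingredient of nuclear norm penalization is singular value thresholding, the proximal operator of the nuclear norm. The sketch below applies it to a noisy low-rank matrix; it is a simplified illustration of the penalty's shrinkage effect, not the paper's trace regression estimator, and the dimensions, rank, and threshold are made-up values.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the prox of tau * nuclear norm.
    Each singular value is shrunk toward zero, producing a low-rank estimate."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

# Noisy observation of a rank-2 matrix (illustrative setup).
rng = np.random.default_rng(1)
A0 = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))
Y = A0 + 0.1 * rng.standard_normal((30, 20))
A_hat = svt(Y, tau=2.0)
print(np.linalg.matrix_rank(A_hat))  # small: noise singular values are thresholded away
```

Choosing the threshold just above the noise level kills the singular values due to noise while retaining the signal directions, which is why such estimators adapt to unknown rank.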

Discriminant analysis for two data sets in $\mathbb{R}^d$ with probability densities f and g can be based on estimation of the set G = {x : f(x) ≥ g(x)}. We consider applications where it is appropriate to…
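A simple plug-in route to estimating the set G above is to replace f and g by kernel density estimates and compare them pointwise. The sketch below does this for two one-dimensional Gaussian samples; the distributions, sample sizes, and the use of `scipy.stats.gaussian_kde` are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Plug-in estimate of G = {x : f(x) >= g(x)} from two samples,
# using kernel density estimates of f and g.
rng = np.random.default_rng(4)
sample_f = rng.normal(-1.0, 1.0, 500)   # sample from f
sample_g = rng.normal(+1.0, 1.0, 500)   # sample from g
f_hat = gaussian_kde(sample_f)
g_hat = gaussian_kde(sample_g)

grid = np.linspace(-4, 4, 9)
in_G = f_hat(grid) >= g_hat(grid)
print(in_G)  # membership of each grid point in the estimated set
```

For these two densities the true set G is the half-line left of 0, and the plug-in estimate recovers that boundary up to estimation error in the two densities.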

This work constructs plug-in classifiers that can achieve not only fast but also super-fast rates, that is, rates faster than $n^{-1}$, and establishes minimax lower bounds showing that the obtained rates cannot be improved.
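A plug-in classifier estimates the regression function η(x) = P(Y = 1 | X = x) nonparametrically and classifies by thresholding the estimate at 1/2. The sketch below uses a nearest-neighbor regression estimate of η; the data-generating model and the choice of 50 neighbors are illustrative, not from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Plug-in rule: estimate eta(x) = P(Y=1 | X=x), then threshold at 1/2.
rng = np.random.default_rng(2)
n = 2000
X = rng.uniform(-1, 1, size=(n, 1))
eta = 1 / (1 + np.exp(-4 * X[:, 0]))          # true conditional probability (illustrative)
y = (rng.uniform(size=n) < eta).astype(int)   # labels drawn with probability eta(x)

eta_hat = KNeighborsRegressor(n_neighbors=50).fit(X, y)
X_test = np.array([[-0.8], [0.8]])
pred = (eta_hat.predict(X_test) >= 0.5).astype(int)
print(pred)
```

Fast and super-fast rates for such rules hinge on how steeply η crosses the level 1/2 (the margin behavior), not on the thresholding step itself, which is the same in every plug-in classifier.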

Image processing is an increasingly important area of research, and there exists a large variety of image reconstruction methods proposed by different authors. This book is concerned with a technique…

This work introduces a new estimation procedure, called Exponential Screening, that exhibits remarkable adaptation properties and is shown to solve, optimally and simultaneously, all the problems of aggregation in Gaussian regression that have been discussed in the literature.

The Group Lasso can achieve an improvement in prediction and estimation properties compared to the Lasso, and the rate of convergence in the upper bounds is proved to be optimal in a minimax sense.
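The mechanism behind the Group Lasso's improvement is that its penalty shrinks each predefined block of coefficients jointly, setting whole groups to zero at once. The blockwise soft-thresholding step below is the proximal operator of that penalty; the coefficient values and grouping are made-up for illustration.

```python
import numpy as np

def group_soft_threshold(beta, groups, tau):
    """Proximal step of the Group Lasso penalty: each group is shrunk
    jointly by its Euclidean norm, so a group is kept or zeroed as a whole."""
    out = beta.copy()
    for g in groups:
        norm = np.linalg.norm(beta[g])
        out[g] = 0.0 if norm <= tau else (1 - tau / norm) * beta[g]
    return out

beta = np.array([3.0, 4.0, 0.2, -0.1])
groups = [np.array([0, 1]), np.array([2, 3])]
shrunk = group_soft_threshold(beta, groups, tau=1.0)
print(shrunk)  # first group shrunk but kept; second group zeroed entirely
```

When the true sparsity pattern aligns with the groups, killing entire blocks removes many noise coordinates in one step, which is the source of the improved rates.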

The notion of optimal rate of aggregation is defined in an abstract context, and lower bounds valid for any method of aggregation are proved, thus establishing the optimal rates of linear, convex and model selection type aggregation.
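A standard aggregation scheme in this literature is exponential weighting: each candidate predictor in a finite dictionary is weighted by the exponential of its negative empirical risk, interpolating between model selection (picking one candidate) and convex aggregation (averaging them). The dictionary, data model, and temperature below are illustrative assumptions.

```python
import numpy as np

# Exponential-weights aggregation over a finite dictionary of predictors.
rng = np.random.default_rng(3)
n = 500
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

# Candidate predictors evaluated at the design points (illustrative dictionary).
candidates = [np.sin(2 * np.pi * x), np.cos(2 * np.pi * x), x, np.zeros(n)]
risks = np.array([np.mean((y - f) ** 2) for f in candidates])

temperature = 0.1
w = np.exp(-risks / temperature)
w /= w.sum()                     # normalized exponential weights
aggregate = sum(wi * f for wi, f in zip(w, candidates))
print(w)  # weights concentrate on the best candidate
```

A small temperature makes the weights concentrate on the empirically best candidate (model selection behavior); a large one spreads them out (convex aggregation behavior), which is one way the different aggregation problems connect.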

It is shown that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of non-zero components of the oracle vector, in a nonparametric regression setting with random design.