Risk bounds for model selection via penalization

@article{Barron1999RiskBF,
  title={Risk bounds for model selection via penalization},
  author={Andrew R. Barron and Lucien Birg{\'e} and Pascal Massart},
  journal={Probability Theory and Related Fields},
  year={1999},
  volume={113},
  pages={301--413}
}
Abstract: Performance bounds for criteria for model selection are developed using recent theory for sieves. The model selection criteria are based on an empirical loss or contrast function with an added penalty term motivated by empirical process theory and roughly proportional to the number of parameters needed to describe the model divided by the number of observations. Most of our examples involve density or regression estimation settings and we focus on the problem of estimating the unknown…
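The criterion described in the abstract can be illustrated with a small sketch (not the paper's exact procedure): fit models of increasing dimension by least squares and select the one minimizing empirical loss plus a penalty proportional to the number of parameters over the sample size. The constant `lam` and the polynomial family are illustrative choices, not taken from the paper.

```python
import numpy as np

# Simulated regression data: y_i = sin(2*pi*x_i) + noise.
rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

def penalized_score(degree, lam=2.0):
    """Empirical squared loss of the degree-d least-squares polynomial fit,
    plus a penalty lam * (d + 1) / n, i.e. proportional to the number of
    parameters divided by the number of observations."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    empirical_loss = np.mean(residuals ** 2)
    return empirical_loss + lam * (degree + 1) / n

# Select the model (polynomial degree) minimizing the penalized criterion.
degrees = range(1, 11)
best = min(degrees, key=penalized_score)
print(best)
```

The empirical loss alone always favors the largest model; the dimension-over-sample-size penalty is what trades approximation accuracy against model complexity, which is the balance the risk bounds quantify.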
Risk of penalized least squares, greedy selection and ℓ1-penalization for flexible function libraries
For function estimation using penalized squared error criteria, we derive generally applicable risk bounds, showing the balance of accuracy of approximation and penalty relative to the sample size.
Model Selection and Error Estimation
A tight relationship between error estimation and data-based complexity penalization is pointed out: any good error estimate may be converted into a data-based penalty function, and the performance of the estimate is governed by the quality of the error estimate.
Adaptive nonparametric instrumental regression by model selection
We consider the problem of estimating the structural function in nonparametric instrumental regression, where in the presence of an instrument W a response Y is modeled in dependence of an endogenous
MODEL SELECTION FOR NONPARAMETRIC REGRESSION
Risk bounds are derived for regression estimation based on model selection over an unrestricted number of models. While a large list of models provides more flexibility, significant selection bias
Minimal Penalties for Gaussian Model Selection
This paper is mainly devoted to a precise analysis of what kind of penalties should be used in order to perform model selection via the minimization of a penalized least-squares type criterion within
Gaussian model selection
Abstract.Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view about this subject. The advantage and
An Asymptotic Property of Model Selection Criteria
It is shown that the optimal rate of convergence is simultaneously achieved for log-densities in Sobolev spaces W_2^s(U) without knowing the smoothness parameter s and norm parameter U in advance.
Adaptive Model Selection Using Empirical Complexities
Given n independent replicates of a jointly distributed pair (X, Y) in R^d × R, we wish to select from a fixed sequence of model classes F_1, F_2, … a deterministic prediction rule f: R^d → R whose
Model selection for regression on a random design
We consider the problem of estimating an unknown regression function when the design is random with values in . Our estimation procedure is based on model selection and does not rely on any prior

References

SHOWING 1-10 OF 81 REFERENCES
An Asymptotic Property of Model Selection Criteria
It is shown that the optimal rate of convergence is simultaneously achieved for log-densities in Sobolev spaces W_2^s(U) without knowing the smoothness parameter s and norm parameter U in advance.
Model selection for regression on a fixed design
Abstract. We deal with the problem of estimating some unknown regression function involved in a regression framework with deterministic design points. To this end, we consider some collection of
From Model Selection to Adaptive Estimation
Many different model selection information criteria can be found in the literature in various contexts including regression and density estimation. There is a huge amount of literature concerning
On the Estimation of a Probability Density Function by the Maximum Penalized Likelihood Method
Abstract: A class of probability density estimates can be obtained by penalizing the likelihood by a functional which depends on the roughness of the logarithm of the density. The limiting case of
Adaptive Spline Estimates for Nonparametric Regression Models
where the errors are independent standard Gaussian random variables, while the regressors x_i are deterministic and equally spaced, i.e., x_i = (2i-1)/(2n). We suppose that the unknown function f(.) is
Rates of convergence for minimum contrast estimators
Summary: We shall present here a general study of minimum contrast estimators in a nonparametric setting (although our results are also valid in the classical parametric case) for independent
The behavior of maximum likelihood estimates under nonstandard conditions
This paper proves consistency and asymptotic normality of maximum likelihood (ML) estimators under weaker conditions than usual. In particular, (i) it is not assumed that the true distribution
Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications
  • D. Haussler, Inf. Comput., 1992
Theorems on the uniform convergence of empirical loss estimates to true expected loss rates for certain hypothesis spaces H are given, and it is shown how this implies learnability with bounded sample size, disregarding computational complexity.
Minimax risk over l_p-balls for l_q-error
Summary: Consider estimating the mean vector θ from data N_n(θ, σ²I) with l_q norm loss, q ≥ 1, when θ is known to lie in an n-dimensional l_p ball, p ∈ (0, ∞). For large n, the ratio of minimax linear risk to minimax
Wavelet Shrinkage: Asymptopia?
Much recent effort has sought asymptotically minimax methods for recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A now rich and complex body of