Corpus ID: 7963781

Component selection and smoothing in smoothing spline analysis of variance models -- COSSO

@inproceedings{Lin2003ComponentSA,
  title={Component selection and smoothing in smoothing spline analysis of variance models -- COSSO},
  author={Yi Lin and Hao Helen Zhang},
  year={2003}
}
We propose a new method for model selection and model fitting in nonparametric regression models, in the framework of smoothing spline ANOVA. The “COSSO” is a method of regularization with the penalty functional being the sum of component norms, instead of the squared norm employed in the traditional smoothing spline method. The COSSO provides a unified framework for several recent proposals for model selection in linear models and smoothing spline ANOVA models. Theoretical properties, such as… 
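The selection behavior described above comes from penalizing a sum of component norms (an L1-type penalty across components) rather than a sum of squared norms. A minimal scalar caricature in Python of why the first yields exact zeros while the second only shrinks; the functions and toy "component sizes" below are illustrative assumptions, not the paper's RKHS optimization:

```python
import numpy as np

def soft_threshold(z, lam):
    # argmin over t of (t - z)**2 + lam * |t|: exact zero when |z| <= lam / 2,
    # mirroring how a sum-of-norms (COSSO-type) penalty can remove a component.
    return np.sign(z) * np.maximum(np.abs(z) - lam / 2, 0.0)

def ridge_shrink(z, lam):
    # argmin over t of (t - z)**2 + lam * t**2: proportional shrinkage,
    # never exactly zero, mirroring the traditional squared-norm penalty.
    return z / (1.0 + lam)

z = np.array([3.0, 0.4, -0.2])    # toy "sizes" of three functional components
l1_fit = soft_threshold(z, 1.0)   # small components are set exactly to zero
sq_fit = ridge_shrink(z, 1.0)     # all components kept, each shrunk
print(l1_fit, sq_fit)
```

The one-dimensional solutions are exact here, which is why the assertion of exact zeros is meaningful: with the absolute-value penalty, any component whose size falls below the threshold is eliminated rather than merely shrunk.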


COSSO-type penalized likelihood method for simultaneous nonparametric regression and model selection in exponential families
This paper extends the component selection and smoothing operator (COSSO), a nonparametric variable selection approach recently developed in Lin and Zhang (2002), to exponential families. We propose
Recursive identification of smoothing spline ANOVA models
In this paper we present a unified discussion of different approaches to identification of smoothing spline ANOVA models. The ‘classical’ approach to smoothing spline ANOVA models can be referred to
COMPONENT SELECTION AND SMOOTHING FOR NONPARAMETRIC REGRESSION IN EXPONENTIAL FAMILIES
This work proposes a new penalized likelihood method for model selection and nonparametric regression in exponential families in the framework of smoothing spline ANOVA and shows that an equivalent formulation of the method leads naturally to an iterative algorithm.
Model Selection and Estimation in Generalized Additive Models and Generalized Additive Mixed Models.
A method of model selection and estimation in generalized additive models (GAMs) for data from a distribution in the exponential family by maximizing the penalized quasi-likelihood with the adaptive LASSO to effectively select the important nonparametric functions.
Using recursive algorithms for the efficient identification of smoothing spline ANOVA models
It is shown that SDR can be effectively combined with the “classical” approach to obtain a more accurate and efficient estimation of smoothing spline ANOVA models to be applied for emulation purposes.
Model selection and smoothing of mean and variance functions in nonparametric heteroscedastic regression
A new multivariate nonparametric heteroscedastic regression procedure in the framework of smoothing spline analysis of variance (SS-ANOVA) based on a COSSO-like penalty, which allows the sparse representations of the mean and variance functions to be discovered when such sparsity exists.
Variable selection for multivariate smoothing splines with correlated random errors
This work proposes some unified approaches to simultaneously select important variables, estimate the multivariate nonparametric function, and estimate the variance components in the framework of smoothing spline analysis of variance (SS-ANOVA), and develops efficient computational algorithms which solve the proposed methods by iteratively solving a quadratic programming (QP) problem and fitting a linear mixed effects model.
Robust spline-based variable selection in varying coefficient model
The varying coefficient model is widely used as an extension of the linear regression model. Many procedures have been developed for model estimation, and recently efficient variable selection
Shrinkage Estimation of the Varying Coefficient Model
The varying coefficient model is a useful extension of the linear regression model. Nevertheless, how to conduct variable selection for the varying coefficient model in a computationally efficient
Smoothing splines are among the most popular methods for estimation of f₀ due to their good empirical performance and sound theoretical support (Cox
A new method for nonparametric function estimation is proposed, which allows for a more flexible estimation of the function in regions of the domain where it has more curvature, and establishes the optimal MSE convergence rate.
...

References

SHOWING 1-10 OF 43 REFERENCES
Smoothing spline ANOVA models for large data sets with Bernoulli observations and the randomized GACV
A class of approximate numerical methods for solving the penalized likelihood variational problem which, in conjunction with the ranGACV method, allows the application of smoothing spline ANOVA models with Bernoulli data to much larger data sets than previously possible.
Variable Selection and Model Building via Likelihood Basis Pursuit
This article presents a nonparametric penalized likelihood approach for variable selection and model building, called likelihood basis pursuit (LBP). In the setting of a tensor product reproducing
Linear Smoothers and Additive Models
It is shown that backfitting is the Gauss-Seidel iterative method for solving a set of normal equations associated with the additive model; conditions for consistency and nondegeneracy are provided, and convergence is proved for the backfitting and related algorithms for a class of smoothers that includes cubic spline smoothers.
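The backfitting / Gauss-Seidel view summarized above can be sketched in a few lines of Python: each component is refit to the partial residual with the other component held fixed, and the sweeps are repeated until the fit stabilizes. The kernel smoother, bandwidth, and toy data here are assumptions for illustration, not the article's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = np.sin(np.pi * x1) + x2 ** 2 + 0.1 * rng.standard_normal(n)
y = y - y.mean()                          # centered response; no intercept term

def smooth(x, r, bw=0.15):
    # Nadaraya-Watson kernel smoother of residual r on x, centered so each
    # fitted component has mean zero (the usual identifiability constraint).
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bw) ** 2)
    fit = (w * r[None, :]).sum(axis=1) / w.sum(axis=1)
    return fit - fit.mean()

f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(20):                       # Gauss-Seidel sweeps over components
    f1 = smooth(x1, y - f2)               # refit f1 to the partial residual
    f2 = smooth(x2, y - f1)               # refit f2 to the partial residual

rmse = np.sqrt(np.mean((y - f1 - f2) ** 2))
print(rmse)                               # residual RMSE after backfitting
```

Updating one component at a time against the current residual of the others is exactly the Gauss-Seidel pattern the article identifies: each sweep solves one block of the normal equations while holding the rest fixed.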
Theory & Methods: Spatially‐adaptive Penalties for Spline Fitting
The paper studies spline fitting with a roughness penalty that adapts to spatial heterogeneity in the regression function. The estimates are pth degree piecewise polynomials with p − 1 continuous
A GENERALIZED APPROXIMATE CROSS VALIDATION FOR SMOOTHING SPLINES WITH NON-GAUSSIAN DATA
A Generalized Approximate Cross Validation function for estimating the smoothing parameter in the penalized log likelihood regression problem with non-Gaussian data is proposed, and it is suggested that the GACV curve may be an approximately unbiased estimate of the Kullback–Leibler distance in the Bernoulli data case.
Regression Shrinkage and Selection via the Lasso
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.
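The constrained form described above is equivalent to an L1-penalized least-squares problem, and a standard way to solve it is coordinate descent with soft-thresholding. A self-contained Python sketch on synthetic data; the (1/2n) objective scaling, the standardized-columns assumption, and the toy coefficients are choices made here, not prescribed by the article:

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=200):
    # Coordinate descent for (1 / (2n)) * ||y - X b||^2 + lam * ||b||_1,
    # assuming each column of X has mean 0 and variance 1.
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_sweeps):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]            # partial residual
            z = X[:, j] @ r / n                       # univariate LS coefficient
            b[j] = np.sign(z) * max(abs(z) - lam, 0)  # soft-threshold update
    return b

rng = np.random.default_rng(2)
n, p = 100, 5
X = rng.standard_normal((n, p))
X = (X - X.mean(0)) / X.std(0)
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])      # two real effects, three nulls
y = X @ beta_true + 0.1 * rng.standard_normal(n)
b = lasso_cd(X, y, lam=0.3)
print(b)   # null coefficients land exactly at zero; real ones are shrunk
```

The exact zeros are what make the lasso a selection method as well as an estimator, and the same mechanism, applied to component norms instead of coefficients, is what the COSSO carries over to smoothing spline ANOVA.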
Tensor product space ANOVA models
  • Yi Lin
  • Mathematics, Computer Science
  • 2000
The fast optimal rate of the TPS-ANOVA model makes it very attractive in high-dimensional function estimation, and many properties of the tensor product space of Sobolev–Hilbert spaces are given.
Bayesian Confidence Intervals for Smoothing Splines
Abstract The frequency properties of Wahba's Bayesian confidence intervals for smoothing splines are investigated by a large-sample approximation and by a simulation study. When the coverage
Diagnostics for Nonparametric Regression Models with Additive Terms
This article proposes and illustrates some simple retrospective diagnostics to help data analysts in detecting possible aliasing effects in computed nonparametric fits and in building parsimonious models in an interactive fashion.
Bayesian Variable Selection and Model Averaging in High-Dimensional Multinomial Nonparametric Regression
This article presents a Bayesian method for estimating nonparametrically a highdimensional multinomial regression model. The regression functions are expressed as sums of main effects and
...