# Generalized Additive Model Selection

```bibtex
@article{Chouldechova2015GeneralizedAM,
  title   = {Generalized Additive Model Selection},
  author  = {Alexandra Chouldechova and Trevor J. Hastie},
  journal = {arXiv: Machine Learning},
  year    = {2015}
}
```

We introduce GAMSEL (Generalized Additive Model Selection), a penalized likelihood approach for fitting sparse generalized additive models in high dimension. Our method interpolates between null, linear and additive models by allowing the effect of each variable to be estimated as being either zero, linear, or a low-complexity curve, as determined by the data. We present a blockwise coordinate descent procedure for efficiently optimizing the penalized likelihood objective over a dense grid of…
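As an illustrative sketch of the selection idea only (not the GAMSEL algorithm itself, which uses a penalized likelihood optimized by blockwise coordinate descent), the snippet below chooses among a null, linear, or low-complexity curve fit for a single predictor, using BIC as a stand-in for the penalized criterion:

```python
import numpy as np

# Toy illustration of zero / linear / low-complexity-curve selection.
# This is NOT the GAMSEL procedure; BIC stands in for its penalty.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-2, 2, n)
y = np.sin(x) + 0.3 * rng.standard_normal(n)  # truly non-linear effect

def bic(y, yhat, k):
    # Gaussian log-likelihood up to constants, plus a complexity penalty
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

fits = {
    "zero": (np.full(n, y.mean()), 1),            # null model
    "linear": (np.polyval(np.polyfit(x, y, 1), x), 2),
    "curve": (np.polyval(np.polyfit(x, y, 4), x), 5),  # low-complexity curve
}
scores = {name: bic(y, yhat, k) for name, (yhat, k) in fits.items()}
best = min(scores, key=scores.get)
print(best)
```

Because the simulated effect is sinusoidal, the penalized criterion should prefer the low-complexity curve over the null and linear fits for this data.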


## 59 Citations

Bayesian Generalized Additive Model Selection Including a Fast Variational Option

- Computer Science
- 2022

This approach allows the effects of continuous predictors to be categorized as zero, linear, or non-linear; mean field variational algorithms with closed-form updates are obtained, which enhances scalability to very large data sets.

Generalized Sparse Additive Models

- Computer Science
- 2019

This work presents a unified framework for estimation and analysis of generalized additive models in high dimensions, encompassing many existing methods, and proves minimax optimal convergence bounds for this class under a weak compatibility condition.

Sparse Partially Linear Additive Models

- Computer Science, arXiv
- 2014

The sparse partially linear additive model (SPLAM) is introduced, which combines model fitting and its model selection challenges into a single convex optimization problem, and can outperform other methods across a broad spectrum of statistical regimes, including the high-dimensional (p ≫ N) setting.

Reluctant generalized additive modeling

- Computer Science
- 2019

A multi-stage algorithm, called RGAM, is proposed that can fit sparse generalized additive models at scale; it is guided by the principle that, all else being equal, one should prefer a linear feature over a non-linear feature.

Partially Linear Additive Gaussian Graphical Models

- Computer Science, Mathematics, ICML
- 2019

This work proposes a partially linear additive Gaussian graphical model (PLA-GGM) for the estimation of associations between random variables distorted by observed confounders, demonstrating superior performance compared to competing methods.

The information detection for the generalized additive model

- Computer Science
- 2020

An algorithm is developed to search for important regressor functions and their related structures by introducing basis functions with a Lasso-type penalized scheme; performance is evaluated through simulation studies and real-data analyses.

Ultrahigh‐dimensional generalized additive model: Unified theory and methods

- Computer Science, Scandinavian Journal of Statistics
- 2021

This article studies a two-step selection and estimation method for ultrahigh-dimensional generalized additive models; the adaptive group lasso estimator is shown to be selection consistent with improved convergence rates.

Group selection and shrinkage with application to sparse semiparametric modeling

- Computer Science
- 2021

A class of group-sparse estimators that combine group subset selection with group lasso or ridge shrinkage are introduced, and their efficacy in modeling supermarket foot traffic and economic recessions using many predictors is demonstrated.

Variable Selection for Additive Models with Missing Response at Random

- Computer Science, Mathematics
- 2017

Two new imputed estimating equation methods are proposed to implement variable selection for additive models with missing response at random, using the smooth-threshold estimating equation; the resulting estimators are shown to enjoy the oracle property.

Introducing the GAMSELECT Procedure for Generalized Additive Model Selection

- Computer Science
- 2020

Examples are provided that demonstrate how PROC GAMSELECT enables you to control both the complexity and smoothness of the model fit, and how the selection methods supported by PROC GAMSELECT compare to alternative modeling approaches supported by related procedures in SAS software.

## References

Showing 1–10 of 18 references

Sparse Partially Linear Additive Models

- Computer Science, arXiv
- 2014

The sparse partially linear additive model (SPLAM) is introduced, which combines model fitting and its model selection challenges into a single convex optimization problem, and can outperform other methods across a broad spectrum of statistical regimes, including the high-dimensional (p ≫ N) setting.

Regularization Paths for Generalized Linear Models via Coordinate Descent.

- Computer Science, Journal of Statistical Software
- 2010

In comparative timings, the new algorithms are considerably faster than competing methods, can handle large problems, and can also deal efficiently with sparse features.

High-dimensional additive modeling

- Computer Science
- 2009

A computationally efficient algorithm with provable numerical convergence properties is presented for optimizing a penalized likelihood based on a new sparsity-smoothness penalty for high-dimensional generalized additive models.

Component selection and smoothing in multivariate nonparametric regression

- Computer Science, Mathematics
- 2006

A detailed analysis reveals that the COSSO does model selection by applying a novel soft thresholding type operation to the function components, which leads naturally to an iterative algorithm.

Regression Shrinkage and Selection via the Lasso

- Computer Science
- 1996

A new method for estimation in linear models, called the lasso, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
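The ℓ1 constraint described above induces sparsity: many coefficient estimates are driven exactly to zero. A minimal sketch using scikit-learn's `Lasso` (the simulated data and penalty level are illustrative choices, not from the paper):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Only the first two of ten predictors truly affect y; the l1 penalty
# should set most of the remaining coefficients exactly to zero.
rng = np.random.default_rng(42)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(100)

model = Lasso(alpha=0.1).fit(X, y)
print(np.round(model.coef_, 2))
```

The two informative coefficients survive (slightly shrunk toward zero by the penalty), while the noise predictors are eliminated from the model.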

Fast stable restricted maximum likelihood and marginal likelihood estimation of semiparametric generalized linear models

- Mathematics
- 2011

Summary. Recent work by Reiss and Ogden provides a theoretical basis for sometimes preferring restricted maximum likelihood (REML) to generalized cross‐validation (GCV) for smoothing parameter…

SpAM: Sparse Additive Models

- Computer Science, NIPS
- 2007

A statistical analysis of the properties of SpAM and empirical results on synthetic and real data show that SpAM can be effective in fitting sparse nonparametric models in high dimensional data.

Model selection and estimation in regression with grouped variables

- Mathematics
- 2006

Summary. We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations with the multifactor…

Strong rules for discarding predictors in lasso‐type problems

- Computer Science, Journal of the Royal Statistical Society, Series B (Statistical Methodology)
- 2012

This work proposes strong rules for discarding predictors in lasso regression and related problems, that are very simple and yet screen out far more predictors than the SAFE rules, and derives conditions under which they are foolproof.