A weight-relaxed model averaging approach for high-dimensional generalized linear models
@article{Ando2017AWM,
  title   = {A weight-relaxed model averaging approach for high-dimensional generalized linear models},
  author  = {Tomohiro Ando and Ker-Chau Li},
  journal = {Annals of Statistics},
  year    = {2017},
  volume  = {45},
  pages   = {2654--2679}
}
Model averaging has long been proposed as a powerful alternative to model selection in regression analysis. However, how well it performs in high-dimensional regression is still poorly understood. Recently, Ando and Li [J. Amer. Statist. Assoc. 109 (2014) 254–265] introduced a new method of model averaging that allows the number of predictors to increase as the sample size increases. One notable feature of Ando and Li's method is the relaxation of the constraint on the total model weights, so that weak signals…
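To illustrate the general idea (not the authors' exact procedure), the following sketch fits several nested candidate models, then combines their held-out predictions with data-driven weights. Note that the classical simplex constraint (nonnegative weights summing to one) is used here for simplicity, whereas the paper's point is precisely to relax the sum constraint; all names and the toy data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 5 predictors, only the first two carry signal.
n, p = 100, 5
X = rng.standard_normal((n, p))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

def ols_predict(X_tr, y_tr, X_te, k):
    """Candidate model k: OLS using only the first k predictors."""
    beta, *_ = np.linalg.lstsq(X_tr[:, :k], y_tr, rcond=None)
    return X_te[:, :k] @ beta

# Split into training and validation halves.
tr, va = slice(0, 50), slice(50, 100)

# Column j holds validation-set predictions from candidate model j+1.
preds = np.column_stack(
    [ols_predict(X[tr], y[tr], X[va], k) for k in range(1, p + 1)]
)

# Weights minimizing validation squared error, then crudely projected
# onto the simplex (clip to nonnegative and renormalize); a proper
# constrained quadratic program would be the standard choice.
w, *_ = np.linalg.lstsq(preds, y[va], rcond=None)
w = np.clip(w, 0, None)
w = w / w.sum()

# Model-averaged prediction on the validation set.
y_hat = preds @ w
print(w.round(3))
```

In the high-dimensional setting studied by the paper, the candidate models would instead be built from groups of predictors with p possibly exceeding n, and the weights would not be forced to sum to one.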
53 Citations
From Model Selection to Model Averaging: A Comparison for Nested Linear Models
- Computer Science
- 2022
The current paper allows a more general nested framework, heteroscedastic and autocorrelated random errors, and sparse coefficients, settings that are more common in practice.
Optimal model averaging for single-index models with divergent dimensions
- Economics, Computer Science
- 2021
A model-averaging estimator based on cross-validation, which allows the dimension of covariates and the number of candidate models to increase with the sample size and is asymptotically optimal when all candidate models are misspecified.
Multifold Cross-Validation Model Averaging for Generalized Additive Partial Linear Models
- Computer Science, Journal of Computational and Graphical Statistics
- 2023
A variable importance measure is proposed to quantify the importance of the predictors in GAPLMs based on the MA weights, which makes it possible to asymptotically identify the variables in the true model.
Robustness of model averaging methods for the violation of standard linear regression assumptions
- Computer Science, Communications for Statistical Applications and Methods
- 2021
Simulations showed that the stacking method tends to outperform BMA and standard linear regression analysis (including the stepwise selection method) in terms of risk or prediction error when typical linear regression assumptions are violated.
Parsimonious Model Averaging With a Diverging Number of Parameters
- Computer Science, Journal of the American Statistical Association
- 2020
It is proved that the proposed procedure is asymptotically optimal in the sense that its squared prediction loss and risk are asymptotically identical to those of the best (but infeasible) model averaging estimator.
Model Averaging for Nonlinear Regression Models
- Mathematics, Journal of Business & Economic Statistics
- 2021
Abstract This article considers the problem of model averaging for regression models that can be nonlinear in their parameters and variables. We consider a nonlinear model averaging (NMA) framework…
Optimal Model Averaging for Semiparametric Partially Linear Models with Censored Data
- Mathematics, Mathematics
- 2023
In the past few decades, model averaging has received extensive attention, and has been regarded as a feasible alternative to model selection. However, this work is mainly based on parametric model…
Model Averaging for Support Vector Machine by Cross-Validation
- Computer Science
- 2021
This paper advocates model averaging as an alternative approach, where estimates obtained from different models are combined in a weighted average, and proposes a model weighting scheme and provides the theoretical underpinning for the proposed method.
AdaBoost Semiparametric Model Averaging Prediction for Multiple Categories
- Mathematics, Journal of the American Statistical Association
- 2020
This article studies variants of model averaging techniques in parametric models with continuous responses for model-based prediction in discrete-time models.
References
Showing 1–10 of 40 references
A Model-Averaging Approach for High-Dimensional Regression
- Computer Science
- 2014
A model-averaging procedure for high-dimensional regression problems in which the number of predictors p exceeds the sample size n is developed, and a theorem is proved, showing that delete-one cross-validation achieves the lowest possible prediction loss asymptotically.
Minimum Mean Squared Error Model Averaging in Likelihood Models
- Mathematics, Economics
- 2015
A data-driven method for frequentist model averaging weight choice is developed for general likelihood models. We propose to estimate the weights which minimize an estimator of the mean squared error…
Bayesian Model Averaging for Linear Regression Models
- Mathematics
- 1997
Abstract We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the…
Variable selection in high-dimensional linear models: partially faithful distributions and the PC-simple algorithm
- Computer Science
- 2010
A simplified version of the PC algorithm is developed, which is computationally feasible even with thousands of covariates and provides consistent variable selection under conditions on the random design matrix that are of a different nature than the coherence conditions for penalty-based approaches like the lasso.
Efficiency for Regularization Parameter Selection in Penalized Likelihood Estimation of Misspecified Models
- Mathematics
- 2013
It has been shown that Akaike information criterion (AIC)-type criteria are asymptotically efficient selectors of the tuning parameter in nonconcave penalized regression methods under the assumption…
Regression Shrinkage and Selection via the Lasso
- Computer Science
- 1996
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
Frequentist Model Average Estimators
- Mathematics, Economics
- 2003
The traditional use of model selection methods in practice is to proceed as if the final selected model had been chosen in advance, without acknowledging the additional uncertainty introduced by…
The Focused Information Criterion
- Mathematics
- 2003
A variety of model selection criteria have been developed, of general and specific types. Most of these aim at selecting a single model with good overall properties, for example, formulated via…
Model selection principles in misspecified models
- Computer Science, Mathematics
- 2014
Novel asymptotic expansions of the Bayesian principle and the Kullback–Leibler divergence principle are derived in misspecified generalized linear models, which give the generalized Bayesian information criterion and generalized Akaike information criterion.