Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well as if the correct submodel were known.
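The nonconcave penalty central to this line of work is the SCAD function. As an illustrative sketch (not code from the paper), the penalty can be evaluated piecewise, with a = 3.7 as suggested by the authors:

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001), evaluated elementwise.

    Piecewise: linear (lasso-like) near zero, quadratic transition,
    then constant, so large coefficients are not over-shrunk.
    a = 3.7 is the value suggested in the paper.
    """
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,  # behaves like the L1 penalty near zero
        np.where(
            t <= a * lam,
            -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1)),  # quadratic spline
            (a + 1) * lam**2 / 2,  # flat: no extra shrinkage for large |t|
        ),
    )
```

The flat tail is what yields the near-unbiasedness behind the oracle property: sufficiently large coefficients incur a constant penalty and are left unshrunk.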
One-Step Sparse Estimates in Nonconcave Penalized Likelihood Models
A new unified algorithm based on the local linear approximation (LLA) is proposed for maximizing the penalized likelihood for a broad class of concave penalty functions, and it is shown that, if the regularization parameter is appropriately chosen, the one-step LLA estimates enjoy the oracle properties given good initial estimators.
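The LLA idea linearizes the concave penalty at an initial estimate, so each update becomes a weighted L1 problem. A minimal sketch for the special case of an orthonormal design (where the weighted L1 step reduces to soft-thresholding; the function names here are illustrative, not from the paper):

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    """Derivative of the SCAD penalty (Fan and Li, 2001) at |t|."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))

def one_step_lla(beta0, z, lam, a=3.7):
    """One-step LLA update for an orthonormal design (simplified sketch).

    z: OLS estimates; beta0: a good initial estimator.
    Linearizing the SCAD penalty at beta0 gives per-coordinate weights
    w_j = p'_lam(|beta0_j|), so the update is soft-thresholding z_j by w_j.
    """
    w = scad_deriv(beta0, lam, a)
    return np.sign(z) * np.maximum(np.abs(z) - w, 0.0)
```

Note the oracle-like behavior: coordinates with a large initial estimate get zero weight (no shrinkage), while coordinates near zero are thresholded at the full lasso level.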
Design and Modeling for Computer Experiments
This book discusses models for computer experiments, design techniques, and key concepts in experimental design for computer experiments.
Feature Screening via Distance Correlation Learning
A numerical comparison indicates that the DC-SIS performs much better than the SIS in various models, and the implementation of the DC-SIS does not require model specification for responses or predictors, which is a very appealing property in ultrahigh-dimensional data analysis.
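The screening statistic behind DC-SIS is the sample distance correlation (Székely et al., 2007), which detects nonlinear dependence without specifying a model. A minimal sketch (the `dc_sis` interface below is illustrative, not the paper's code):

```python
import numpy as np

def dcor(x, y):
    """Sample distance correlation between two 1-D samples."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    a = np.abs(x - x.T)  # pairwise distance matrices
    b = np.abs(y - y.T)
    # double-center each distance matrix
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    if dvar_x * dvar_y == 0:
        return 0.0
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

def dc_sis(X, y, d):
    """Rank predictors by distance correlation with y; keep the top d."""
    omega = np.array([dcor(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(omega)[::-1][:d]
```

Because distance correlation is zero only under independence, the ranking picks up predictors related to the response through nonlinear or non-monotone forms that marginal Pearson correlation (as in the SIS) would miss.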
Variable Selection for Cox's Proportional Hazards Model and Frailty Model
A class of variable selection procedures for parametric models via nonconcave penalized likelihood was proposed in Fan and Li (2001a). It has been shown there that the resulting procedures perform as…
Efficient Estimation and Inferences for Varying-Coefficient Models
Abstract This article deals with statistical inferences based on the varying-coefficient models proposed by Hastie and Tibshirani. Local polynomial regression techniques are used to estimate…
Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Liping Zhu, Lexin Li, Runze Li, Lixing Zhu
- Computer Science · Journal of the American Statistical Association
- 1 December 2011
It is demonstrated that, with the number of predictors growing at an exponential rate with the sample size, the proposed procedure possesses ranking consistency, which is both useful in its own right and can lead to consistency in selection.
Tuning Parameter Selectors for the Smoothly Clipped Absolute Deviation Method
This work shows that the commonly used generalized cross-validation criterion cannot select the tuning parameter satisfactorily, producing a nonignorable overfitting effect in the resulting model, and proposes a BIC tuning parameter selector, which is shown to identify the true model consistently.
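The BIC-type selector trades residual fit against model size with a log(n)/n penalty, which is what gives selection consistency where GCV overfits. A minimal sketch in that spirit (the `fits` interface of fitted values and nonzero counts is an assumption for illustration, not the paper's notation):

```python
import numpy as np

def bic_select(y, fits):
    """BIC-type tuning-parameter selection, in the spirit of the SCAD-BIC
    selector: among candidate penalized fits (one per lambda), pick the one
    minimizing  log(RSS/n) + df * log(n) / n,
    where df is the number of nonzero coefficients.

    fits: list of (fitted_values, n_nonzero) pairs -- illustrative interface.
    Returns the index of the selected candidate.
    """
    n = len(y)
    bics = [
        np.log(np.mean((y - yhat) ** 2)) + df * np.log(n) / n
        for yhat, df in fits
    ]
    return int(np.argmin(bics))
```

An overfitted candidate gains little in residual error but pays df * log(n)/n, so the true-sized model wins as n grows; GCV's effective penalty is too weak to rule such candidates out.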
Variable Selection Using MM Algorithms
This article proposes a new class of algorithms for finding a maximizer of the penalized likelihood for a broad class of penalty functions and proves that when these MM algorithms converge, they must converge to a desirable point.
New Estimation and Model Selection Procedures for Semiparametric Modeling in Longitudinal Data Analysis
Semiparametric regression models are very useful for longitudinal data analysis. The complexity of semiparametric models and the structure of longitudinal data pose new challenges to parametric…