Publications
Local polynomial modelling and its applications
Introduction. Overview of Existing Methods. Framework for Local Polynomial Regression. Automatic Determination of Model Complexity. Applications of Local Polynomial Modeling. Applications in …
Large Covariance Estimation by Thresholding Principal Orthogonal Complements.
The Principal Orthogonal complEment Thresholding (POET) method is introduced to explore such an approximate factor structure with sparsity, and provides mathematical insight into when factor analysis is approximately the same as principal component analysis for high-dimensional data.
High Dimensional Classification Using Features Annealed Independence Rules.
The conditions under which all the important features can be selected by the two-sample t-statistic are established, and the choice of the optimal number of features, or equivalently the threshold value of the test statistics, is proposed based on an upper bound of the classification error.
Test of Significance Based on Wavelet Thresholding and Neyman's Truncation
Abstract: Traditional nonparametric tests, such as the Kolmogorov-Smirnov test and the Cramér-von Mises test, are based on the empirical distribution functions. Although these tests possess root-n …
Data‐Driven Bandwidth Selection in Local Polynomial Fitting: Variable Bandwidth and Spatial Adaptation
When estimating a mean regression function and its derivatives, locally weighted least squares regression has proven to be a very attractive technique. The present paper focuses on the important …
Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
This work shows that, with general nonparametric models and under some mild technical conditions, the proposed independence screening methods have a sure screening property; the extent to which the dimensionality can be reduced by independence screening is also explicitly quantified.
A Selective Overview of Variable Selection in High Dimensional Feature Space.
A brief account of recent developments in theory, methods, and implementations for high-dimensional variable selection is presented, and the properties of non-concave penalized likelihood and its roles in high-dimensional statistical modeling are emphasized.
Network Exploration via the Adaptive LASSO and SCAD Penalties.
Non-concave penalties and the adaptive LASSO penalty are introduced to attenuate the bias problem in network estimation, addressing the problem of precision matrix estimation.