Publications
Nearly unbiased variable selection under minimax concave penalty
We propose MC+, a fast, continuous, nearly unbiased and accurate method of penalized variable selection in high-dimensional linear regression. The LASSO is fast and continuous, but biased. The bias …
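For concreteness, the minimax concave penalty (MCP) at the heart of MC+ has the closed form below, where \lambda is the regularization level and \gamma > 1 controls the concavity:

\rho(t; \lambda, \gamma) = \lambda \int_0^{|t|} \Big(1 - \frac{x}{\gamma\lambda}\Big)_+ \, dx
  = \begin{cases} \lambda |t| - t^2/(2\gamma), & |t| \le \gamma\lambda, \\ \gamma\lambda^2/2, & |t| > \gamma\lambda. \end{cases}

The penalty agrees with the LASSO penalty near zero but becomes constant beyond \gamma\lambda, which is what removes the bias on large coefficients.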
Confidence intervals for low dimensional parameters in high dimensional linear models
type="main" xml:id="rssb12026-abs-0001"> The purpose of this paper is to propose methodologies for statistical inference of low dimensional parameters with high dimensional data. We focus onExpand
Scaled sparse linear regression
Scaled sparse linear regression jointly estimates the regression coefficients and noise level in a linear model. It chooses an equilibrium with a sparse regression method by iteratively estimating …
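A minimal sketch of the alternating iteration described above, assuming scikit-learn's Lasso; the penalty constant lambda0 (often taken around sqrt(2 log(p)/n)) and the stopping rule are illustrative choices rather than the paper's recommendations.

import numpy as np
from sklearn.linear_model import Lasso

def scaled_lasso(X, y, lambda0, n_iter=20, tol=1e-6):
    """Jointly estimate coefficients and the noise level by alternation."""
    n = X.shape[0]
    sigma = np.std(y)                        # initial noise-level guess
    coef = np.zeros(X.shape[1])
    for _ in range(n_iter):
        # Penalty scales with the current noise estimate; sklearn's Lasso
        # minimizes (1/2n)||y - Xb||^2 + alpha * ||b||_1.
        model = Lasso(alpha=lambda0 * sigma).fit(X, y)
        coef = model.coef_
        resid = y - model.predict(X)
        sigma_new = np.linalg.norm(resid) / np.sqrt(n)  # updated noise level
        if abs(sigma_new - sigma) < tol:                # equilibrium reached
            sigma = sigma_new
            break
        sigma = sigma_new
    return coef, sigma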
The sparsity and bias of the Lasso selection in high-dimensional linear regression
Meinshausen and Bühlmann [Ann. Statist. 34 (2006) 1436-1462] showed that, for neighborhood selection in Gaussian graphical models, under a neighborhood stability condition, the LASSO is consistent, …
Adaptive Lasso for sparse high-dimensional regression models
We study the asymptotic properties of the adaptive Lasso estimators in sparse, high-dimensional, linear regression models when the number of covariates may increase with the sample size. We consider …
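A minimal sketch of the adaptive Lasso via the standard column-rescaling reduction to an ordinary Lasso; the ridge initializer, gamma = 1, and the eps guard are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Lasso, Ridge

def adaptive_lasso(X, y, alpha, gamma=1.0, eps=1e-8):
    """Weighted l1 penalty alpha * sum_j w_j |b_j| via column rescaling."""
    init = Ridge(alpha=1.0).fit(X, y).coef_   # initial consistent estimate
    w = 1.0 / (np.abs(init) ** gamma + eps)   # adaptive weights: small for big coefficients
    Xw = X / w                                # rescaled design: column j divided by w_j
    model = Lasso(alpha=alpha).fit(Xw, y)     # ordinary Lasso in the rescaled coordinates
    return model.coef_ / w                    # map coefficients back to the original scale

Large initial coefficients receive small weights and so are penalized less, which is the mechanism behind the estimator's oracle behavior.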
Optimal rates of convergence for covariance matrix estimation
The covariance matrix plays a central role in multivariate statistical analysis. Significant advances have been made recently in developing both theory and methodology for estimating large covariance …
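As an illustration, a tapering estimator of the kind analyzed in this line of work shrinks sample-covariance entries linearly with their distance from the diagonal; the bandwidth k is an input that the theory selects from the assumed smoothness class.

import numpy as np

def tapered_covariance(X, k):
    """Taper the p x p sample covariance of an n x p data matrix."""
    S = np.cov(X, rowvar=False)                  # sample covariance matrix
    p = S.shape[0]
    d = np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # |i - j| distances
    kh = k / 2.0
    # Weight 1 within k/2 of the diagonal, linear decay to 0 at distance k.
    w = np.clip(2.0 - d / kh, 0.0, 1.0)
    return S * w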
A group bridge approach for variable selection
The proposed group bridge approach is a penalized regularization method built on a specially designed group bridge penalty. It has the oracle group selection property: it correctly selects the important groups with probability converging to one.
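A sketch of the group bridge criterion for groups A_1, …, A_K, where the constants c_k adjust for group sizes (the exact normalization is in the paper):

\hat\beta = \arg\min_\beta \; \|y - X\beta\|_2^2 + \lambda \sum_{k=1}^{K} c_k \, \|\beta_{A_k}\|_1^{\gamma}, \qquad 0 < \gamma < 1.

Raising each group's l1 norm to a power \gamma < 1 lets the penalty zero out whole groups while still allowing variable selection within the retained groups.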
A General Theory of Concave Regularization for High-Dimensional Sparse Estimation Problems
A general theoretical framework is presented showing that, under appropriate conditions, the global solution of nonconvex regularization achieves desirable recovery performance and coincides with the unique sparse local solution, which can be obtained via different numerical procedures.
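The class of estimators such a framework covers can be written as penalized least squares with a coordinate-separable concave penalty \rho, of which MCP and SCAD are standard instances:

\hat\beta = \arg\min_\beta \; \frac{1}{2n}\|y - X\beta\|_2^2 + \sum_{j=1}^{p} \rho(|\beta_j|; \lambda).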
On Tensor Completion via Nuclear Norm Minimization
A convex optimization approach to tensor completion is investigated that directly minimizes a tensor nuclear norm; this is proved to yield an improved sample size requirement. The analysis develops a series of algebraic and probabilistic techniques that may be of independent interest and useful in other tensor-related problems.
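In its exact-recovery form, the approach can be sketched as the constrained program below, where \|\cdot\|_* denotes the tensor nuclear norm (the dual of the tensor spectral norm) and \Omega indexes the observed entries; this is a sketch of the general formulation, not a statement of the paper's precise setup:

\hat T = \arg\min_{A} \; \|A\|_* \quad \text{subject to} \quad A_\omega = T_\omega \ \text{for all} \ \omega \in \Omega.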
One Permutation Hashing
One permutation hashing can achieve similar (or even better) accuracy than the k-permutation scheme, and experiments training SVM and logistic regression classifiers confirm that the one-permutation scheme performs similarly to the original (k-permutation) minwise hashing.
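A minimal sketch of one permutation hashing for binary (set) data, assuming element IDs lie in [0, D) and k divides D; the function and parameter names are illustrative.

import numpy as np

def one_permutation_sketch(elements, D, k, seed=0):
    """Hash a set of integer IDs in [0, D) into k bins using one permutation."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(D)          # a single random permutation of the universe
    width = D // k                     # each bin covers a contiguous block of positions
    sketch = np.full(k, -1)            # -1 marks an empty bin
    for e in elements:
        p = perm[e]                    # permuted position of element e
        b, offset = divmod(p, width)   # bin index, location within the bin
        if sketch[b] == -1 or offset < sketch[b]:
            sketch[b] = offset         # keep the minimum within each bin
    return sketch

Resemblance between two sets is then estimated as the fraction of jointly non-empty bins whose stored minima agree, replacing k independent minwise permutations with a single pass.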