Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
@article{Bunea2011JointVA,
  title   = {Joint variable and rank selection for parsimonious estimation of high-dimensional matrices},
  author  = {Florentina Bunea and Yiyuan She and Marten H. Wegkamp},
  journal = {Annals of Statistics},
  year    = {2012},
  volume  = {40},
  pages   = {2359--2388}
}
We propose dimension reduction methods for sparse, high-dimensional multivariate response regression models. Both the number of responses and that of the predictors may exceed the sample size. Sometimes viewed as complementary, predictor selection and rank reduction are the most popular strategies for obtaining lower-dimensional approximations of the parameter matrix in such models. We show in this article that important gains in prediction accuracy can be obtained by considering them jointly…
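The criterion at the heart of the paper, searching jointly over predictor subsets and ranks, can be illustrated with a short numerical sketch. The code below is not the authors' algorithm: the exhaustive search over small row supports and the penalty form `lam * r * (|J| + q)` are illustrative stand-ins (the paper derives calibrated penalty constants), but the reduced-rank regression step for each candidate pair is the classical one.

```python
import numpy as np
from itertools import combinations

def rrr(X, Y, r):
    """Rank-r reduced-rank regression: project the OLS solution onto the
    top-r right singular vectors of the fitted values X @ B_ols."""
    B_ols = np.linalg.pinv(X) @ Y
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V_r = Vt[:r].T                       # top-r right singular vectors
    return B_ols @ V_r @ V_r.T

def joint_select(X, Y, lam=1.0, max_rows=3):
    """Minimize RSS + lam * r * (|J| + q) jointly over row supports J
    (up to max_rows predictors) and ranks r.  The penalty form is a
    stand-in for the calibrated penalty derived in the paper."""
    n, p = X.shape
    q = Y.shape[1]
    best = (np.inf, None, None)
    for k in range(1, max_rows + 1):
        for J in combinations(range(p), k):
            XJ = X[:, J]
            for r in range(1, min(k, q) + 1):
                B = rrr(XJ, Y, r)
                rss = np.sum((Y - XJ @ B) ** 2)
                crit = rss + lam * r * (k + q)
                if crit < best[0]:
                    best = (crit, J, r)
    return best

# Toy example: 2 relevant predictors out of 10 carrying a rank-1 signal.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
B_true = np.zeros((10, 4))
B_true[:2] = np.outer([1.0, -1.0], [1.0, 2.0, 3.0, 4.0])  # rank 1
Y = X @ B_true + 0.1 * rng.standard_normal((50, 4))
crit, J, r = joint_select(X, Y, lam=0.5)
print("selected rows:", J, "selected rank:", r)
```

The joint search is what distinguishes this from running variable selection and rank selection separately: a support that looks suboptimal at full rank can win once the rank penalty is taken into account.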
100 Citations
Reduced-rank Regression in Sparse Multivariate Varying-Coefficient Models with High-dimensional Covariates
- Mathematics
- 2013
In genetic studies, not only can the number of predictors obtained from microarray measurements be extremely large, but there can also be multiple response variables. Motivated by such a situation, we…
Bayesian sparse multiple regression for simultaneous rank reduction and variable selection.
- Computer Science · Biometrika
- 2020
A carefully devised shrinkage prior on the matrix of regression coefficients obviates the need to specify a prior on the rank and shrinks the regression matrix towards low-rank and row-sparse structures.
Parametric and semiparametric reduced-rank regression with flexible sparsity
- Computer Science, Mathematics · J. Multivar. Anal.
- 2015
Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Mathematics · The Annals of Statistics
- 2019
We consider the multivariate response regression problem with a regression coefficient matrix of low, unknown rank. In this setting, we analyze a new criterion for selecting the optimal reduced rank…
Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
- Mathematics, Computer Science · J. Multivar. Anal.
- 2017
Adaptive Estimation in Two-way Sparse Reduced-rank Regression
- Computer Science
- 2014
The proposed estimator is shown to achieve near-optimal non-asymptotic minimax rates of estimation simultaneously under a collection of squared Schatten norm losses, via both error bounds for the estimator and matching minimax lower bounds.
Adaptive Sparse Reduced-rank Regression
- Computer Science
- 2014
A new estimation scheme is proposed that achieves competitive numerical performance and significantly reduces computation time compared with state-of-the-art methods, while attaining near-optimal non-asymptotic minimax rates of estimation simultaneously under a collection of squared Schatten norm losses.
Dimensionality Reduction and Variable Selection in Multivariate Varying-Coefficient Models With a Large Number of Covariates
- Mathematics
- 2018
Motivated by the study of gene and environment interactions, we consider a multivariate response varying-coefficient model with a large number of covariates. The need of nonparametrically…
Best Subset Selection in Reduced Rank Regression
- Computer Science · arXiv
- 2022
A novel selection scheme is proposed to directly identify the best subset of predictors via a primal-dual formulation, and a computationally efficient algorithm with guaranteed convergence is developed that scales to high-dimensional data.
Sparse reduced-rank regression with covariance estimation
- Computer Science · Statistics and Computing
- 2014
This work proposes to select relevant variables for reduced-rank regression using a sparsity-inducing penalty, and to simultaneously estimate the error covariance matrix using a similar penalty on the precision matrix; a numerical algorithm is developed to solve the resulting penalized regression problem.
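A paraphrase of the kind of joint objective described here (not a quotation from the cited paper) is the penalized negative Gaussian log-likelihood over the coefficient matrix $B$ and the error precision matrix $\Omega$:

$$
\min_{B,\ \Omega \succ 0}\ \frac{1}{n}\,\operatorname{tr}\!\big[(Y - XB)\,\Omega\,(Y - XB)^{\top}\big] - \log\det\Omega + \lambda_1 P(B) + \lambda_2 \sum_{j \neq k} |\Omega_{jk}|,
$$

where $P(B)$ is the sparsity-inducing penalty on the coefficient matrix; objectives of this form are typically minimized by alternating between updates of $B$ and $\Omega$.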
References
Showing 1-10 of 37 references
Dimension reduction and coefficient estimation in multivariate linear regression
- Mathematics
- 2007
We introduce a general formulation for dimension reduction and coefficient estimation in the multivariate linear model. We argue that many of the existing methods that are commonly used in…
Optimal selection of reduced rank estimators of high-dimensional matrices
- Mathematics, Computer Science
- 2011
A new criterion, the Rank Selection Criterion (RSC), is introduced for selecting the optimal reduced-rank estimator of the coefficient matrix in multivariate response regression models; it has very low computational complexity, linear in the number of candidate models, making it particularly appealing for large-scale problems.
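The RSC admits a very short implementation: keep the singular values of the projected response matrix that exceed a threshold. A minimal sketch, in which the threshold `mu` is left as a user-supplied constant (the paper calibrates it to the noise level and problem dimensions):

```python
import numpy as np

def rank_selection_criterion(X, Y, mu):
    """Select the rank as the number of singular values of the
    projected responses P_X @ Y that exceed the threshold mu."""
    P = X @ np.linalg.pinv(X)            # orthogonal projector onto col(X)
    d = np.linalg.svd(P @ Y, compute_uv=False)
    return int(np.sum(d >= mu))
```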
Low rank multivariate regression
- Mathematics, Computer Science
- 2010
This paper proposes a criterion to select among a family of low-rank estimators, proves a non-asymptotic oracle inequality for the resulting estimator, and investigates the easier case where the variance of the noise is known.
Estimation of high-dimensional low-rank matrices
- Mathematics, Computer Science
- 2010
This work investigates penalized least squares estimators with a Schatten-p quasi-norm penalty term and derives bounds for the $k$th entropy numbers of the quasi-convex Schatten class embeddings $S_p^M \hookrightarrow S_2^M$, $p < 1$, which are of independent interest.
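For reference, the Schatten-$p$ quasi-norm of a matrix $A$ with singular values $\sigma_1(A) \ge \sigma_2(A) \ge \cdots$ is

$$
\|A\|_{S_p} = \Big(\sum_j \sigma_j(A)^p\Big)^{1/p}, \qquad 0 < p < 1,
$$

a quasi-norm rather than a norm because for $p < 1$ the triangle inequality holds only up to a multiplicative constant.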
Oracle Inequalities and Optimal Inference under Group Sparsity
- Computer Science, Mathematics
- 2010
The Group Lasso can achieve an improvement in prediction and estimation over the Lasso, and the rates of convergence in the derived upper bounds are proved to be optimal in a minimax sense.
Sparse principal component analysis via regularized low rank matrix approximation
- Computer Science
- 2008
Consistent group selection in high-dimensional linear regression.
- Computer Science, Mathematics · Bernoulli
- 2010
An adaptive group Lasso method is proposed, generalizing the adaptive Lasso; it requires an initial estimator to improve the selection results and is shown to be consistent in group selection under certain conditions.
Nuclear norm penalization and optimal rates for noisy low rank matrix completion
- Computer Science, Mathematics
- 2010
A new nuclear norm penalized estimator of $A_0$ is proposed, and a general sharp oracle inequality for this estimator is established for arbitrary values of $n, m_1, m_2$ under a condition of isometry in expectation, yielding the best trace regression model approximating the data.
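In the special case of identity design (matrix denoising), the nuclear norm penalized estimator has a closed form via soft-thresholding of singular values. A minimal sketch of that special case, not of the general trace regression setting treated in the reference:

```python
import numpy as np

def svt(Y, lam):
    """Singular value thresholding: solves
    argmin_A 0.5 * ||Y - A||_F**2 + lam * ||A||_*  (nuclear norm)."""
    U, d, Vt = np.linalg.svd(Y, full_matrices=False)
    d_shrunk = np.maximum(d - lam, 0.0)   # soft-threshold singular values
    return (U * d_shrunk) @ Vt
```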
Model selection and estimation in regression with grouped variables
- Mathematics
- 2006
We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations with the multifactor…
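In its standard form, with groups $G_1, \dots, G_J$ of sizes $p_1, \dots, p_J$, the group lasso criterion introduced in this reference is

$$
\hat\beta = \arg\min_\beta\ \frac{1}{2}\Big\|y - \sum_{j=1}^{J} X_{G_j}\beta_{G_j}\Big\|_2^2 + \lambda \sum_{j=1}^{J} \sqrt{p_j}\,\|\beta_{G_j}\|_2,
$$

where the unsquared $\ell_2$ norm on each group forces entire groups of coefficients to zero simultaneously, rather than individual coordinates.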