• Corpus ID: 235795687

The EAS approach to variable selection for multivariate response data in high-dimensional settings

@inproceedings{Koner2021TheEA,
  title={The EAS approach to variable selection for multivariate response data in high-dimensional settings},
  author={Salil Koner and Jonathan P. Williams},
  year={2021}
}
In this paper, we extend the epsilon admissible subsets (EAS) model selection approach, from its original construction in the high-dimensional linear regression setting, to an EAS framework for performing group variable selection in the high-dimensional multivariate regression setting. Assuming a matrix-Normal linear model, we show that the EAS strategy is asymptotically consistent if there exists a sparse, true data-generating set of predictors. Nonetheless, our EAS strategy is designed to… 

Tables from this paper

On the proof of posterior contraction for sparse generalized linear models with multivariate responses
TLDR: This paper provides a corrected proof of Theorems 3 and 4 of Bai and Ghosh (2018), extends the MBSP model to multivariate generalized linear models (GLMs), and quantifies the rate at which the posterior contracts around the true regression coefficients.
Discussion of “A Gibbs Sampler for a Class of Random Convex Polytopes”
An exciting new algorithmic breakthrough has been advanced for how to carry out inferences in a Dempster-Shafer (DS) formulation of a categorical data generating model. The developed sampling

References

SHOWING 1-10 OF 63 REFERENCES
Nonpenalized variable selection in high-dimensional linear model settings via generalized fiducial inference
TLDR: An entirely new perspective on variable selection is presented within a generalized fiducial inference framework, and it is shown that the procedure very naturally assigns small probabilities to subsets of covariates which include redundancies, by way of explicit $L_{0}$ minimization.
Bayesian Variable Selection Regression of Multivariate Responses for Group Data
TLDR: Two multivariate extensions of the Bayesian group lasso are proposed for variable selection and estimation with high-dimensional predictors and multi-dimensional response variables, and are compared to state-of-the-art variable selection strategies on simulated data sets.
The EAS approach for graphical selection consistency in vector autoregression models
As evidenced by various recent and significant papers within the frequentist literature, along with numerous applications in macroeconomics, genomics, and neuroscience, there continues to be
Simultaneous Variable Selection
TLDR: A new method is proposed for selecting a common subset of explanatory variables to model several response variables, based on the (joint) residual sum of squares while constraining the parameter estimates to lie within a suitable polyhedral region.
Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
TLDR: A group-lasso-type penalty is applied that treats each row of the regression coefficient matrix as a group, and this penalty is shown to satisfy certain desirable invariance properties of the reduced-rank regression coefficient matrix.
Confidence intervals for low dimensional parameters in high dimensional linear models
TLDR: The proposed method turns the regression data into an approximate Gaussian sequence of point estimators of individual regression coefficients, which can be used to select variables after proper thresholding, and the accuracy of the coverage probability and other desirable properties of the proposed confidence intervals are demonstrated.
$\chi^2$-confidence sets in high-dimensional regression
TLDR: It is shown that, under $\ell_1$-sparsity conditions on the regression coefficients $\beta_0$, the square-root Lasso produces a consistent estimator of the noise variance, and this procedure leads to an asymptotically $\chi^2$-distributed pivot, with a remainder term depending only on the $\ell_1$-error of the initial estimator.
Common Subset Selection of Inputs in Multiresponse Regression
  • Timo Similä, J. Tikka
  • The 2006 IEEE International Joint Conference on Neural Network Proceedings
  • 2006
TLDR: The multiresponse sparse regression algorithm is an input selection method for linearly parameterized models, updated with carefully chosen step lengths, that competes favorably with other methods when many correlated inputs are available for model construction.
Dimension reduction and coefficient estimation in multivariate linear regression
Summary: We introduce a general formulation for dimension reduction and coefficient estimation in the multivariate linear model. We argue that many of the existing methods that are commonly used in
Bayesian variable selection with shrinking and diffusing priors
We consider a Bayesian approach to variable selection in the presence of high dimensional covariates based on a hierarchical model that places prior distributions on the regression coefficients as
...