Corpus ID: 246063482

Inference in High-dimensional Multivariate Response Regression with Hidden Variables

@inproceedings{Bing2022InferenceIH,
  title={Inference in High-dimensional Multivariate Response Regression with Hidden Variables},
  author={Xin Bing and Wei Cheng and Huijie Feng and Yang Ning},
  year={2022}
}
This paper studies inference on the regression coefficient matrix in multivariate response linear regression in the presence of hidden variables. A novel procedure for constructing confidence intervals for entries of the coefficient matrix is proposed. Our method first exploits the multivariate nature of the responses by estimating and adjusting for the hidden effects to construct an initial estimator of the coefficient matrix. By further deploying a low-dimensional projection procedure to…
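The two-step idea in the abstract — adjust the responses for estimated hidden effects, then debias an initial estimator via a low-dimensional projection to get entrywise confidence intervals — can be sketched in a small simulation. Everything below (the dimensions, the SVD-based surrogate for the hidden variables, the lasso initial estimator, and the debiasing formula) is an illustrative reading of the abstract, not the authors' algorithm.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated data: n samples, p covariates, m responses, K hidden variables.
n, p, m, K = 200, 10, 8, 2
X = rng.standard_normal((n, p))
Z = rng.standard_normal((n, K))               # unobserved hidden variables
Theta = np.zeros((p, m)); Theta[0, 0] = 2.0   # sparse coefficient matrix
B = rng.standard_normal((K, m))               # loadings of the hidden variables
Y = X @ Theta + Z @ B + 0.5 * rng.standard_normal((n, m))

# Step 1: estimate and adjust for the hidden effect. Project Y off the column
# space of X; the top-K left singular vectors of the residual serve as a
# surrogate for Z (they are exactly orthogonal to X by construction).
P_X = X @ np.linalg.pinv(X.T @ X) @ X.T
R = (np.eye(n) - P_X) @ Y
U, s, _ = np.linalg.svd(R, full_matrices=False)
Z_hat = U[:, :K] * s[:K]
Y_adj = Y - Z_hat @ np.linalg.lstsq(Z_hat, Y, rcond=None)[0]

# Initial estimator: column-wise lasso on the adjusted responses.
Theta_init = np.column_stack(
    [Lasso(alpha=0.1).fit(X, Y_adj[:, j]).coef_ for j in range(m)]
)

# Step 2: low-dimensional projection (debiasing) for a single entry Theta[0, 0].
# The projection direction w is the residual of x1 regressed on the other
# covariates; the one-step correction removes the lasso shrinkage bias.
x1, X_rest = X[:, 0], X[:, 1:]
w = x1 - X_rest @ np.linalg.lstsq(X_rest, x1, rcond=None)[0]
resid = Y_adj[:, 0] - X @ Theta_init[:, 0]
theta_hat = Theta_init[0, 0] + w @ resid / (w @ x1)
se = np.sqrt(np.mean(resid**2)) * np.linalg.norm(w) / abs(w @ x1)
ci = (theta_hat - 1.96 * se, theta_hat + 1.96 * se)
print(f"estimate {theta_hat:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

In the genuinely high-dimensional regime (p ≫ n) the projection direction w would itself be estimated by a regularized regression rather than least squares; the OLS version above only illustrates the structure of the correction.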
