TESTING CONSTANCY OF CONDITIONAL VARIANCE IN HIGH DIMENSION

@article{Deng2020TESTINGCO,
  title={TESTING CONSTANCY OF CONDITIONAL VARIANCE IN HIGH DIMENSION},
  author={Lu Deng and Changliang Zou and Zhaojun Wang and Xin Chen},
  journal={Statistica Sinica},
  year={2020}
}
Testing constancy of the conditional covariance matrix is a fundamental problem. Deviations from this assumption can result in severely inefficient estimates. In this article, we propose a slice-based procedure to test for constant conditional variance in cases where the data dimension is larger than the sample size. We develop a high-order correction that makes the test statistic robust to high dimensionality, and show that the proposed test statistic is asymptotically normal under some…
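The slicing idea behind such procedures can be illustrated in a simple one-dimensional setting: order the observations by an index variable, partition them into slices, and compare the within-slice variances. The sketch below is only an illustration of this idea using Bartlett's classical homogeneity-of-variance statistic; it is not the high-dimensional, correction-adjusted statistic developed in the paper, and the function name and slice count are choices made here for demonstration.

```python
import numpy as np

def slice_variance_test(x, y, n_slices=5):
    """Illustrative slice-based check for constant conditional variance.

    Sorts observations by a scalar index x, partitions them into
    n_slices slices, and compares within-slice sample variances of y
    via Bartlett's statistic. Under the null of constant variance the
    statistic is approximately chi-square with n_slices - 1 degrees
    of freedom.
    """
    order = np.argsort(x)
    slices = np.array_split(np.asarray(y)[order], n_slices)
    n_h = np.array([len(s) for s in slices])          # slice sizes
    s2_h = np.array([np.var(s, ddof=1) for s in slices])  # slice variances
    N, k = n_h.sum(), len(slices)
    # Pooled variance across slices.
    s2_pool = np.sum((n_h - 1) * s2_h) / (N - k)
    # Bartlett's statistic with its small-sample correction factor c.
    stat = (N - k) * np.log(s2_pool) - np.sum((n_h - 1) * np.log(s2_h))
    c = 1 + (np.sum(1.0 / (n_h - 1)) - 1.0 / (N - k)) / (3 * (k - 1))
    return stat / c

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y_const = rng.normal(size=500)                       # constant variance
y_hetero = (1 + np.abs(x)) * rng.normal(size=500)    # variance grows with |x|
```

Here the heteroscedastic sample should produce a much larger statistic than the homoscedastic one, since its within-slice variances differ systematically across slices.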

