Corpus ID: 237373938

Sparse principal component analysis for high-dimensional stationary time series

@inproceedings{Fujimori2021SparsePC,
  title={Sparse principal component analysis for high-dimensional stationary time series},
  author={Kou Fujimori and Yuichi Goto and Yan Liu and Masanobu Taniguchi},
  year={2021}
}
We consider the sparse principal component analysis for high-dimensional stationary processes. The standard principal component analysis performs poorly when the dimension of the process is large. We establish the oracle inequalities for penalized principal component estimators for the processes including heavy-tailed time series. The rate of convergence of the estimators is established. We also elucidate the theoretical rate for choosing the tuning parameter in penalized estimators. …
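The abstract describes penalized estimators of sparse principal components. The sketch below is a generic illustration of that idea, not the paper's estimator: it simulates a stationary vector autoregression whose leading covariance eigenvector is sparse, then recovers that direction by power iteration with soft-thresholding (a standard penalized sparse-PCA scheme). All names, dimensions, and the penalty level `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a p-dimensional stationary VAR(1) process whose covariance has a
# spiked leading eigenvector loading only on the first s coordinates.
p, n, s = 50, 400, 5
v = np.zeros(p)
v[:s] = 1 / np.sqrt(s)                 # sparse spike direction (assumption)
X = np.zeros((n, p))
x = np.zeros(p)
for t in range(n):
    f = rng.normal()                   # common factor along v
    x = 0.5 * x + 3.0 * f * v + rng.normal(size=p)
    X[t] = x

S = np.cov(X, rowvar=False)            # sample covariance matrix


def sparse_leading_eigvec(S, lam, iters=200):
    """Power iteration with soft-thresholding: a generic penalized
    sparse-PCA scheme; lam plays the role of the tuning parameter."""
    w = rng.normal(size=S.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        w = S @ w
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)  # soft-threshold
        nrm = np.linalg.norm(w)
        if nrm == 0.0:                 # penalty too large: estimate collapses
            return w
        w /= nrm
    return w


w_hat = sparse_leading_eigvec(S, lam=0.5)
print("estimated support:", np.flatnonzero(np.abs(w_hat) > 1e-8))
```

The tuning parameter `lam` trades off sparsity against bias, mirroring the paper's point that its rate must be chosen carefully; here it is simply fixed by hand.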

