Semiparametric Tensor Factor Analysis by Iteratively Projected SVD
@article{Chen2020SemiparametricTF, title={Semiparametric Tensor Factor Analysis by Iteratively Projected SVD}, author={Elynn Y. Chen and Dong Xia and Chencheng Cai and Jianqing Fan}, journal={arXiv: Methodology}, year={2020} }
This paper introduces a general framework of Semiparametric TEnsor FActor analysis (STEFA), focusing on the methodology and theory of low-rank tensor decomposition with auxiliary covariates. STEFA models extend tensor factor models by incorporating instrumental covariates in the loading matrices. We propose an Iteratively Projected SVD (IP-SVD) algorithm for the semiparametric estimation. It iteratively projects the tensor data onto the linear space spanned by the covariates and applies SVD on…
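The abstract describes the core computational recipe of IP-SVD: project each mode's matricization onto the space spanned by a (sieve) basis of that mode's covariates, take the leading left singular vectors, and alternate over the modes. The NumPy sketch below only illustrates this projected power-iteration idea for an order-3 tensor; the function names, the basis matrices `Phi`, the `ranks` input and the fixed iteration count are illustrative placeholders, and the paper's actual IP-SVD (sieve construction, initialization, stopping rule, treatment of the orthogonal residual loadings) should be taken from the paper itself.

```python
import numpy as np

def mode_unfold(T, k):
    """Mode-k matricization: rows index mode k, columns the remaining modes."""
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

def mode_multiply(T, M, k):
    """Multiply tensor T along mode k by matrix M."""
    Tk = M @ mode_unfold(T, k)
    rest = [T.shape[j] for j in range(T.ndim) if j != k]
    return np.moveaxis(Tk.reshape([M.shape[0]] + rest), 0, k)

def ip_svd(Y, Phi, ranks, n_iter=10):
    """Projected power-iteration sketch for a covariate-assisted Tucker factor model.

    Y     : order-3 data tensor
    Phi   : list of covariate (sieve) basis matrices, Phi[k] of shape (Y.shape[k], m_k)
    ranks : target multilinear ranks (r_1, r_2, r_3)
    """
    d = Y.ndim
    # Orthogonal projectors onto the column spaces spanned by the covariate bases.
    P = [B @ np.linalg.pinv(B.T @ B) @ B.T for B in Phi]

    # Spectral initialization: leading left singular vectors of each projected unfolding.
    U = []
    for k in range(d):
        u, _, _ = np.linalg.svd(P[k] @ mode_unfold(Y, k), full_matrices=False)
        U.append(u[:, :ranks[k]])

    # Iterate: compress the other modes with their current loadings, project onto the
    # covariate space of mode k, and refresh the mode-k loadings via SVD.
    for _ in range(n_iter):
        for k in range(d):
            Z = Y
            for j in range(d):
                if j != k:
                    Z = mode_multiply(Z, U[j].T, j)
            u, _, _ = np.linalg.svd(P[k] @ mode_unfold(Z, k), full_matrices=False)
            U[k] = u[:, :ranks[k]]

    # Estimated core (factor) tensor under the fitted loadings.
    F = Y
    for k in range(d):
        F = mode_multiply(F, U[k].T, k)
    return U, F
```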
10 Citations
Tensor Factor Model Estimation by Iterative Projection
- Computer Science
- 2020
Two approaches for estimating a tensor factor model via iterative orthogonal projections of the original tensor time series are introduced; they are similar to higher-order orthogonal projection methods for tensor decomposition, but with significant differences and their own theoretical properties.
Generalized Low-rank plus Sparse Tensor Estimation by Fast Riemannian Optimization
- Computer Science
- 2022
A fast algorithm is proposed that integrates Riemannian gradient descent with a novel gradient pruning procedure; it achieves non-trivial error bounds for heavy-tailed tensor PCA whenever the noise has a finite (2 + ε)-th moment.
Tensor Principal Component Analysis in High Dimensional CP Models
- Computer Science, ArXiv
- 2021
New computationally efficient composite PCA and concurrent orthogonalization algorithms are proposed for tensor CP decomposition, with theoretical guarantees under mild incoherence conditions; implementations on synthetic data demonstrate the practical superiority of the approach over existing methods.
CP Factor Model for Dynamic Tensors
- Computer Science
- 2021
A new high-order projection estimator is proposed for such a factor model, utilizing its special structure and the idea of the higher-order orthogonal iteration procedures commonly used in Tucker-type tensor factor models and general tensor CP decomposition procedures.
Rank Determination in Tensor Factor Model
- Mathematics, SSRN Electronic Journal
- 2020
The factor model is an appealing and effective analytic tool for high-dimensional time series, with a wide range of applications in economics, finance and statistics. One of the fundamental issues in…
Vector or Matrix Factor Model? A Strong Rule Helps!
- Mathematics, Computer Science
- 2021
A family of randomised tests is proposed to check whether an eigenvalue diverges as the sample size passes to infinity (corresponding to the presence of a common factor) or not, together with a de-randomised, “strong” decision rule to decide in favour of or against the presence of common factors.
Online Change-point Detection for Matrix-valued Time Series with Latent Two-way Factor Structure
- Mathematics
- 2021
This paper proposes a novel methodology for the online detection of changepoints in the factor structure of large matrix time series. Our approach is based on the well-known fact that, in the…
One-way or Two-way Factor Model for Matrix Sequences?
- Mathematics, Computer Science
- 2021
This paper proposes a family of randomised tests to check whether a one-way or a two-way factor structure exists, together with a de-randomized, “strong” decision rule to decide in favor of or against the presence of common factors.
Tucker tensor factor models for high-dimensional higher-order tensor observations
- Computer Science
- 2022
Two sets of PCA-based estimation procedures are proposed, with convergence rates and asymptotic distributions for the estimators of the loading matrices, latent tensor factors, and signal parts; the procedures outperform existing auto-covariance-based methods for tensor time series in terms of estimation and tensor reconstruction.
Some recent methods for analyzing high dimensional time series
- Computer Science, Spanish Journal of Statistics
- 2021
This article reviews six recent advances in the analysis of high-dimensional time series: dynamic quantiles for data visualization and clustering by dependency to split the series into homogeneous groups, and procedures for determining the number of factors, estimating DFMs with cluster structure, and modeling matrices of time series.
References
Showing 1–10 of 46 references
Sparse Higher-Order Principal Components Analysis
- Computer Science, AISTATS
- 2012
The Sparse Higher-Order SVD and the Sparse CP Decomposition are proposed, which solve an ℓ1-norm penalized relaxation of the single-factor CP optimization problem, thereby automatically selecting relevant features for each tensor factor.
Partially Observed Dynamic Tensor Response Regression
- Computer Science, Journal of the American Statistical Association
- 2021
This article develops a regression model with a partially observed dynamic tensor as the response and external covariates as predictors; it introduces low-rank, sparsity and fusion structures on the regression coefficient tensor and considers a loss function projected over the observed entries.
Provable sparse tensor decomposition
- Computer Science
- 2015
A novel sparse tensor decomposition method, the tensor truncated power method, is proposed; it incorporates variable selection in the estimation of the decomposition components and attains guarantees that significantly improve on those of existing non-sparse decomposition methods.
Tensor Decompositions via Two-Mode Higher-Order SVD (HOSVD)
- Computer Science, AISTATS
- 2017
This new method, built on Kruskal's uniqueness theorem to decompose symmetric, nearly orthogonally decomposable tensors, provably handles a greater level of noise than previous methods and achieves high estimation accuracy.
Tensor SVD: Statistical and Computational Limits
- Computer Science, IEEE Transactions on Information Theory
- 2018
A general framework for tensor singular value decomposition (tensor SVD), which focuses on the methodology and theory for extracting the hidden low-rank structure from high-dimensional tensor data, is proposed.
PROJECTED PRINCIPAL COMPONENT ANALYSIS IN FACTOR MODELS.
- Mathematics, Annals of Statistics
- 2016
A flexible semi-parametric factor model is proposed, which decomposes the factor loading matrix into a component that can be explained by subject-specific covariates and an orthogonal residual component; the rates of convergence obtained for the smooth factor loading matrices are much faster than those of conventional factor analysis (see the sketch below).
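As context for STEFA, the project-then-PCA recipe summarized above can be sketched in a few lines of NumPy. This is only an illustration under simplifying assumptions, not the estimator defined in that paper; the helper name `projected_pca`, the sieve basis `Phi`, and the residual-loading step are placeholders.

```python
import numpy as np

def projected_pca(Y, Phi, r):
    """Minimal sketch of projected PCA for a vector factor model.

    Y   : p x n panel of observations
    Phi : p x m sieve basis evaluated at subject-specific covariates
    r   : number of factors
    """
    # Projector onto the space spanned by the covariate basis functions.
    P = Phi @ np.linalg.pinv(Phi.T @ Phi) @ Phi.T

    # Smooth the cross-section with the covariates, then run PCA on the projected
    # panel; its leading left singular vectors estimate the covariate-explained
    # part of the loading space.
    Y_proj = P @ Y
    U, _, _ = np.linalg.svd(Y_proj, full_matrices=False)
    smooth_loadings = U[:, :r]

    # Factor estimates (up to the usual rotation/scaling indeterminacy).
    factors = smooth_loadings.T @ Y_proj

    # Orthogonal residual loadings: regress Y on the estimated factors and keep
    # the component not explained by the covariates.
    full_loadings = Y @ factors.T @ np.linalg.pinv(factors @ factors.T)
    residual_loadings = (np.eye(P.shape[0]) - P) @ full_loadings
    return smooth_loadings, residual_loadings, factors
```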
Regularized Tensor Factorizations and Higher-Order Principal Components Analysis
- Computer Science
- 2012
Frameworks for sparse tensor factorizations, or Sparse HOPCA, are introduced, based on heuristic algorithmic approaches and on solving penalized optimization problems related to the CP decomposition.
The Sup-norm Perturbation of HOSVD and Low Rank Tensor Denoising
- Computer Science, J. Mach. Learn. Res.
- 2019
The sup-norm perturbation bounds of HOSVD reveal unconventional phase transitions for statistical learning applications such as exact clustering in high-dimensional Gaussian mixture models and exact support recovery in sub-tensor localization.
Learning from Binary Multiway Data: Probabilistic Tensor Decomposition and its Statistical Optimality
- Computer Science, J. Mach. Learn. Res.
- 2020
A multilinear Bernoulli model is proposed, a rank-constrained likelihood-based estimation method is developed, and theoretical accuracy guarantees are obtained for the parameter tensor estimation.
Supervised singular value decomposition and its asymptotic properties
- Computer Science, J. Multivar. Anal.
- 2016