Multilinear Common Component Analysis via Kronecker Product Representation
@article{Yoshikawa2021MultilinearCC,
  title   = {Multilinear Common Component Analysis via Kronecker Product Representation},
  author  = {Kohei Yoshikawa and Shuichi Kawano},
  journal = {Neural Computation},
  year    = {2021},
  volume  = {33},
  pages   = {2853--2880}
}
Abstract

We consider the problem of extracting a common structure from multiple tensor data sets. For this purpose, we propose multilinear common component analysis (MCCA) based on Kronecker products of mode-wise covariance matrices. MCCA constructs a common basis, represented by linear combinations of the original variables, that loses little of the information in the multiple tensor data sets. We also develop an estimation algorithm for MCCA that guarantees mode-wise global convergence. Numerical…
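The construction described in the abstract can be sketched as follows. This is a hedged illustration of the idea only, not the authors' estimation algorithm: pool a covariance matrix per mode across all tensor data sets, take each matrix's leading eigenvectors as a mode-wise basis, and form the common basis as the Kronecker product of the mode bases. All names, ranks, and data here are hypothetical.

```python
import numpy as np

# Illustrative sketch of the MCCA idea (not the paper's algorithm):
# several 3rd-order data sets share one basis per mode, and the
# Kronecker product of the mode bases spans a common subspace.

rng = np.random.default_rng(0)
datasets = [rng.standard_normal((50, 6, 5)) for _ in range(3)]  # (samples, I1, I2)

def mode_covariance(data, mode):
    """Pooled covariance of the mode-`mode` fibers (mode 1 or 2)."""
    cov = 0.0
    for X in data:
        for sample in X:                      # sample: I1 x I2 matrix
            M = sample if mode == 1 else sample.T
            cov = cov + M @ M.T
    return cov / sum(len(X) for X in data)

bases = []
for mode, rank in [(1, 3), (2, 2)]:
    S = mode_covariance(datasets, mode)
    eigvals, eigvecs = np.linalg.eigh(S)      # eigenvalues in ascending order
    bases.append(eigvecs[:, -rank:])          # keep top-`rank` eigenvectors

# Kronecker product of the mode bases: a common (6*5) x (3*2) basis.
common_basis = np.kron(bases[0], bases[1])
print(common_basis.shape)  # (30, 6)
```

The point of the Kronecker structure is that one only ever eigen-decomposes small mode-wise matrices (here 6x6 and 5x5) instead of the full 30x30 covariance.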
References
Showing 1–10 of 45 references
STPCA: Sparse tensor Principal Component Analysis for feature extraction
- Computer Science, Proceedings of the 21st International Conference on Pattern Recognition (ICPR 2012)
- 2012
Sparse Tensor Principal Component Analysis (STPCA) is proposed, which transforms the eigen-decomposition problem to a series of regression problems and can also address the occlusion problem.
Common component analysis for multiple covariance matrices
- Computer Science, KDD
- 2011
A detailed analysis of CCA yields an effective initialization and iterative algorithms for the problem with provable approximation guarantees w.r.t. the global maximum, and allows one to choose the dimensionality for a given level of approximation error.
Sparse Higher-Order Principal Components Analysis
- Computer Science, AISTATS
- 2012
The Sparse Higher-Order SVD and the Sparse CP Decomposition are proposed, which solve an ℓ1-norm penalized relaxation of the single-factor CP optimization problem, thereby automatically selecting relevant features for each tensor factor.
Fast Multilinear Singular Value Decomposition for Structured Tensors
- Computer Science, SIAM J. Matrix Anal. Appl.
- 2008
This paper presents fast algorithms for computing the full and the rank-truncated HOSVD of third-order structured tensors, derived by considering two specific ways to unfold a structured tensor, leading to structured matrix unfoldings whose SVD can be efficiently computed.
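The unfold-then-SVD idea behind the HOSVD can be sketched in a few lines. This is a plain dense-tensor sketch, not the fast structured-tensor algorithms the paper develops; the function names and ranks are illustrative.

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding: move that axis to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Rank-truncated HOSVD: per-mode left singular vectors, then the core."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # Mode-`mode` product of the current core with U.T.
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode
        )
    return core, factors

T = np.random.default_rng(1).standard_normal((4, 5, 6))
core, factors = hosvd(T, ranks=(2, 3, 3))
print(core.shape)  # (2, 3, 3)
```

With full ranks the factors are orthogonal and the decomposition is exact; the paper's contribution is computing these unfoldings' SVDs cheaply when the tensor is structured.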
Multilinear Sparse Principal Component Analysis
- Computer Science, IEEE Transactions on Neural Networks and Learning Systems
- 2014
The key operation of MSPCA is to rewrite the MPCA into multilinear regression forms and relax it for sparse regression, which has the potential to outperform the existing PCA-based subspace learning algorithms.
Sparse common component analysis for multiple high-dimensional datasets via noncentered principal component analysis
- Computer Science, Statistical Papers
- 2018
Sparsity is incorporated into CCA, and a novel strategy for sparse common component analysis based on L1-type regularized regression modeling is proposed to recover a sparse common structure efficiently in multiple dataset analysis.
Tensor Decompositions and Applications
- Computer Science, SIAM Rev.
- 2009
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order…
MPCA: Multilinear Principal Component Analysis of Tensor Objects
- Computer Science, IEEE Transactions on Neural Networks
- 2008
It is shown that even without a fully optimized design, an MPCA-based gait recognition module achieves highly competitive performance and compares favorably to the state-of-the-art gait recognizers.
Tensor graphical lasso (TeraLasso)
- Computer Science, Journal of the Royal Statistical Society: Series B (Statistical Methodology)
- 2019
The paper introduces a multiway tensor generalization of the bigraphical lasso which uses a two-way sparse Kronecker sum multivariate normal model for the precision matrix to model parsimoniously…
Common Principal Components in k Groups
- Mathematics
- 1984
This article generalizes the method of principal components to so-called "common principal components" as follows: Consider the hypothesis that the covariance matrices Σ_i for k populations…