Tensor Decompositions and Applications

@article{Kolda2009TensorDA,
  title={Tensor Decompositions and Applications},
  author={Tamara G. Kolda and Brett W. Bader},
  journal={SIAM Rev.},
  year={2009},
  volume={51},
  pages={455--500}
}
This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be… 
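As a concrete illustration of the CP (CANDECOMP/PARAFAC) model, one of the two decompositions the survey treats as foundational, the minimal NumPy sketch below reconstructs a 3-way tensor from its factor matrices; the sizes and rank are invented for the example.

import numpy as np

def cp_reconstruct(A, B, C):
    # X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]  (rank-R CP model)
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Invented sizes and rank, purely for illustration.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
X = cp_reconstruct(A, B, C)  # a tensor of CP rank at most R
print(X.shape)               # (4, 5, 6)

Citations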
An Optimization Approach for Fitting Canonical Tensor Decompositions
TLDR
The mathematical calculation of the derivatives is discussed, and it is shown that they can be computed efficiently, at the same cost as one iteration of ALS; the resulting optimization approach is much more accurate than ALS and orders of magnitude faster than NLS.
A scalable optimization approach for fitting canonical tensor decompositions
TLDR
The mathematical calculation of the derivatives of the canonical tensor decomposition is discussed, and it is shown that they can be computed efficiently, at the same cost as one iteration of ALS; the resulting approach is more accurate than ALS and faster than NLS in terms of total computation time.
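In both papers above, the central object is the gradient of the least-squares CP objective $f(\mathbf{A},\mathbf{B},\mathbf{C}) = \tfrac{1}{2}\|\mathcal{X} - [\![\mathbf{A},\mathbf{B},\mathbf{C}]\!]\|_F^2$, whose dominant cost is the same matricized-tensor-times-Khatri-Rao product (MTTKRP) that dominates one ALS iteration. A minimal NumPy sketch for a 3-way tensor, assuming a row-major unfolding convention:

import numpy as np

def khatri_rao(B, C):
    # Column-wise Kronecker product of B (J x R) and C (K x R) -> (J*K) x R.
    J, R = B.shape
    K, _ = C.shape
    return (B[:, None, :] * C[None, :, :]).reshape(J * K, R)

def cp_gradient_A(X, A, B, C):
    # Gradient of 0.5 * ||X - [[A, B, C]]||_F^2 with respect to A, where X is I x J x K.
    I, J, K = X.shape
    X1 = X.reshape(I, J * K)           # mode-1 unfolding (row-major)
    KR = khatri_rao(B, C)              # matches the row-major unfolding above
    G = (B.T @ B) * (C.T @ C)          # R x R Hadamard product of Gram matrices
    return A @ G - X1 @ KR             # the MTTKRP term X1 @ KR dominates the cost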
Generalized Visual Information Analysis Via Tensorial Algebra
TLDR
The generalized t-matrix algorithms, namely TSVD, THOSVD, TPCA, T2DPCA, and TGCA, are applied to low-rank approximation, reconstruction, and supervised classification of images; experiments show that the t-matrix algorithms compare favorably with standard matrix algorithms.
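To convey the flavor of these algorithms, the sketch below uses a common FFT-based t-product construction (an assumption of this sketch; the paper's t-matrix formulation may differ in details): take an FFT along the third mode, compute a matrix SVD of each frontal slice, and transform back.

import numpy as np

def tsvd(T):
    # t-SVD of a 3-way array T (n1 x n2 x n3) under the FFT-based t-product.
    n1, n2, n3 = T.shape
    Th = np.fft.fft(T, axis=2)
    U = np.zeros((n1, n1, n3), dtype=complex)
    S = np.zeros((n1, n2, n3), dtype=complex)
    V = np.zeros((n2, n2, n3), dtype=complex)
    for k in range(n3):
        u, s, vh = np.linalg.svd(Th[:, :, k])
        U[:, :, k] = u
        np.fill_diagonal(S[:, :, k], s)   # singular values on the slice diagonal
        V[:, :, k] = vh.conj().T
    # Transform back along mode 3; small imaginary parts may remain numerically.
    return np.fft.ifft(U, axis=2), np.fft.ifft(S, axis=2), np.fft.ifft(V, axis=2)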
Tensor Networks for Latent Variable Analysis. Part I: Algorithms for Tensor Train Decomposition
TLDR
The novel algorithms developed for the tensor train decomposition update one or several core tensors at each iteration in an alternating fashion, and exhibit enhanced mathematical tractability and scalability to exceedingly large-scale data tensors.
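For reference, the baseline (non-alternating) TT-SVD construction that such update schemes refine fits in a few lines; eps is an assumed relative truncation threshold.

import numpy as np

def tt_svd(X, eps=1e-10):
    # Decompose an N-way array into tensor-train cores via sequential truncated SVDs.
    dims = X.shape
    cores, r_prev = [], 1
    M = X.reshape(r_prev * dims[0], -1)
    for n in range(len(dims) - 1):
        U, s, Vh = np.linalg.svd(M, full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))        # drop small singular values
        cores.append(U[:, :r].reshape(r_prev, dims[n], r))
        M = (s[:r, None] * Vh[:r]).reshape(r * dims[n + 1], -1)
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))       # last core absorbs the remainder
    return cores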
Breaking the Curse of Dimensionality Using Decompositions of Incomplete Tensors: Tensor-based scientific computing in big data analysis
TLDR
Higher-order tensors and their decompositions are abundantly present in domains such as signal processing, scientific computing, and quantum information theory, and can be exploited using compressed sensing methods that work on incomplete tensors, i.e., tensors with only a few known elements.
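A minimal sketch of the starting point for such methods, assuming a CP model and a 0/1 observation mask W (the cited solvers are considerably more sophisticated): the fit is evaluated only on the known entries.

import numpy as np

def masked_cp_loss(X, W, A, B, C):
    # 0.5 * || W * (X - [[A, B, C]]) ||_F^2, where W is 1 on known entries, 0 elsewhere.
    Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
    R = W * (X - Xhat)               # residual restricted to observed entries
    return 0.5 * np.sum(R * R)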
Non-redundant tensor decomposition
TLDR
A non-redundant tensor decomposition is presented that includes SVD as a particular case, describes a tensor as a set of variables, defines an upper bound for the rank of tensors, and does not have redundancy as in the cases of CP and Tucker decompositions.
Introduction to Tensor Decompositions and their Applications in Machine Learning
TLDR
Basic tensor concepts are introduced, why tensors can be considered more rigid than matrices with respect to the uniqueness of their decomposition, the most important factorization algorithms and their properties are explained, and concrete examples of tensor decomposition applications in machine learning are provided.
Sequential Unfolding SVD for Tensors With Applications in Array Signal Processing
TLDR
A novel PARATREE tensor model is introduced, accompanied by the sequential unfolding SVD (SUSVD) algorithm, which is orthogonal, fast, and reliable to compute; the order (or rank) of the decomposition can be adaptively adjusted.
Non-Orthogonal Tensor Diagonalization, a Tool for Block Tensor Decompositions
TLDR
This paper presents algorithms for non-orthogonal tensor diagonalization that can be used for block tensor decomposition; they have low computational complexity, comparable to that of the fastest available canonical polyadic decomposition algorithms.
Computing Dense Tensor Decompositions with Optimal Dimension Trees
TLDR
It is shown that finding an optimal dimension tree for an N-dimensional tensor is NP-hard for both CP and Tucker decompositions, and faster exact algorithms are provided for finding this tree.
...

References

Showing 1–10 of 343 references
From Matrix to Tensor: Multilinear Algebra and Signal Processing
TLDR
An overview is given of some important tensor algebraic concepts and their implications in signal processing, and the expansion of a higher-order tensor into a sum of non-orthogonal rank-1 components is investigated.
Efficient MATLAB Computations with Sparse and Factored Tensors
TLDR
This paper considers how specially structured tensors allow for efficient storage and computation, and proposes storing sparse tensors using coordinate format and describes the computational efficiency of this scheme for various mathematical operations, including those typical to tensor decomposition algorithms.
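The coordinate format is simple to state: one row of subscripts plus one value per nonzero. The illustrative class below (invented for this sketch, not the Tensor Toolbox API) shows the storage scheme and a typical kernel, tensor-times-vector along one mode.

import numpy as np

class SparseTensorCOO:
    # subs[n] holds the N integer subscripts of the nonzero value vals[n].
    def __init__(self, subs, vals, shape):
        self.subs = np.asarray(subs)    # nnz x N
        self.vals = np.asarray(vals)    # nnz
        self.shape = tuple(shape)

    def ttv(self, v, mode):
        # Contract one mode with the vector v; the result stays in coordinate format.
        keep = [m for m in range(len(self.shape)) if m != mode]
        scaled = self.vals * v[self.subs[:, mode]]
        out = {}
        for idx, val in zip(map(tuple, self.subs[:, keep]), scaled):
            out[idx] = out.get(idx, 0.0) + val          # accumulate duplicate indices
        subs = np.array(list(out.keys())).reshape(-1, len(keep))
        return SparseTensorCOO(subs, np.array(list(out.values())),
                               [self.shape[m] for m in keep])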
Tensor-CUR decompositions for tensor-based data
TLDR
In the hyperspectral data application, the tensor-CUR decomposition is used to compress the data, and it is shown that classification quality is not substantially reduced even after substantial data compression.
Efficient Computer Manipulation of Tensor Products with Applications to Multidimensional Approximation
TLDR
The objective of this paper is to make it possible to perform matrix-vector operations in tensor product spaces using only the factors instead of the tensor-product operators themselves, yielding efficient algorithms for solving systems of linear equations whose coefficient matrices are tensor products of nonsingular matrices.
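The core identity, sketched below for two factors: $(\mathbf{A} \otimes \mathbf{B})x$ can be applied without ever forming the Kronecker product, by reshaping $x$ into a matrix and multiplying by the factors. A NumPy sketch, assuming row-major reshapes:

import numpy as np

def kron_matvec(A, B, x):
    # (A kron B) @ x == vec(A @ X @ B.T) with X = x.reshape(n, q) under row-major vec.
    n, q = A.shape[1], B.shape[1]
    return (A @ x.reshape(n, q) @ B.T).ravel()

# Sanity check against the explicit Kronecker product on small made-up factors.
rng = np.random.default_rng(1)
A, B = rng.standard_normal((3, 4)), rng.standard_normal((5, 2))
x = rng.standard_normal(4 * 2)
assert np.allclose(kron_matvec(A, B, x), np.kron(A, B) @ x)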
Low rank Tucker-type tensor approximation to classical potentials
Abstract: This paper investigates best rank-$(r_1, \dots, r_d)$ Tucker tensor approximation of higher-order tensors arising from the discretization of linear operators and functions in $\mathbb{R}^d$. Super-convergence…
Multilinear operators for higher-order decompositions
  • T. Kolda
  • Mathematics, Computer Science
  • 2006
TLDR
Two new multilinear operators are proposed for expressing the matrix compositions that are needed in the Tucker and PARAFAC (CANDECOMP) decompositions and one of them is shorthand for performing an n-mode matrix multiplication for every mode of a given tensor.
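A sketch of the n-mode (tensor-times-matrix) product that such shorthand abbreviates, using the standard definition in which mode $n$ of $\mathcal{X}$ (size $I_n$) is contracted with a matrix $\mathbf{U} \in \mathbb{R}^{J \times I_n}$:

import numpy as np

def mode_n_product(X, U, n):
    # X x_n U: contract mode n of X (size I_n) with U (J x I_n); the result has size J in mode n.
    Xm = np.moveaxis(X, n, 0)               # bring mode n to the front
    Y = np.tensordot(U, Xm, axes=(1, 0))    # J x (remaining modes)
    return np.moveaxis(Y, 0, n)             # put the new mode back in place

# Multiplying along every mode, as the shorthand operator does.
X = np.arange(24.0).reshape(2, 3, 4)
for n, Un in enumerate([np.ones((5, 2)), np.ones((6, 3)), np.ones((7, 4))]):
    X = mode_n_product(X, Un, n)
print(X.shape)   # (5, 6, 7)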
Independent component analysis and (simultaneous) third-order tensor diagonalization
TLDR
It is shown that simultaneous optimal diagonalization of "third-order tensor slices" of the fourth-order cumulant is a suitable strategy, similar in spirit to the efficient JADE algorithm.
Multilinear subspace analysis of image ensembles
TLDR
A dimensionality reduction algorithm is proposed that enables subspace analysis within the multilinear framework, based on a tensor decomposition known as the N-mode SVD, the natural extension to tensors of the conventional matrix singular value decomposition (SVD).
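A minimal sketch of the N-mode SVD (the HOSVD): each factor matrix collects leading left singular vectors of the corresponding mode unfolding, and the core tensor is obtained by projecting onto those factors. Names and the rank truncation are illustrative.

import numpy as np

def hosvd(X, ranks):
    # Truncated N-mode SVD: one factor per mode, core formed by mode-n projections.
    factors = []
    for n, r in enumerate(ranks):
        Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)   # mode-n unfolding
        U, _, _ = np.linalg.svd(Xn, full_matrices=False)
        factors.append(U[:, :r])                            # leading left singular vectors
    core = X
    for n, U in enumerate(factors):
        # n-mode product with U.T projects mode n onto the r-dimensional subspace
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, n, 0), axes=(1, 0)), 0, n)
    return core, factors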
Linear image coding for regression and classification using the tensor-rank principle
  • A. Shashua, Anat Levin
  • Computer Science
    Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. CVPR 2001
  • 2001
TLDR
It is found that for regression the tensor-rank coding, as a dimensionality reduction technique, significantly outperforms other techniques like PCA.
A Jacobi-Type Method for Computing Orthogonal Tensor Decompositions
TLDR
An algorithm for tensors is proposed that extends the Jacobi SVD algorithm for matrices; the idea is to “condense” a tensor into fewer nonzero entries using orthogonal transformations.
...