Tensor Decompositions for Signal Processing Applications: From two-way to multiway component analysis

@article{Cichocki2014TensorDF,
  title={Tensor Decompositions for Signal Processing Applications: From two-way to multiway component analysis},
  author={Andrzej Cichocki and Danilo P. Mandic and Lieven De Lathauwer and Guoxu Zhou and Qibin Zhao and Cesar F. Caiafa and Anh-Huy Phan},
  journal={IEEE Signal Processing Magazine},
  year={2015},
  volume={32},
  number={2},
  pages={145--163}
}
The widespread use of multisensor technology and the emergence of big data sets have highlighted the limitations of standard flat-view matrix models and the necessity to move toward more versatile data analysis tools. We show that higher-order tensors (i.e., multiway arrays) enable such a fundamental paradigm shift toward models that are essentially polynomial, the uniqueness of which, unlike the matrix methods, is guaranteed under very mild and natural conditions. Benefiting from the power of… 
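
Since several of the summaries below turn on the relationship between a multiway array and its "flat" matrix views, a minimal pure-Python sketch of mode-0 unfolding (matricization) may help fix ideas; the nested-list representation and the function name are illustrative, not taken from the paper:

```python
def unfold_mode0(tensor):
    """Mode-0 unfolding: arrange the mode-0 fibers of a 3-way tensor
    (nested lists, shape I x J x K) as rows of an I x (J*K) matrix."""
    I, J, K = len(tensor), len(tensor[0]), len(tensor[0][0])
    # Column index j + J*k follows the usual column-major fiber ordering.
    return [[tensor[i][j][k] for k in range(K) for j in range(J)]
            for i in range(I)]

# A 2x2x2 tensor flattened into a 2x4 matrix view:
X = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
```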

Multiscale analysis for higher-order tensors

A new adaptive, multi-scale tensor decomposition method for higher-order data inspired by hybrid linear modeling and subspace clustering techniques is introduced, where a given tensor is first permuted and then partitioned into several sub-tensors each of which can be represented as a low-rank tensor with increased representational efficiency.
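
As a rough illustration of the "partitioned into several sub-tensors" step only (not the adaptive permutation or the low-rank modeling; the function name and nested-list format are assumptions), a cubical tensor can be split into contiguous blocks like so:

```python
def partition_blocks(tensor, b):
    """Split a cubical 3-way tensor (nested lists, edge length n) into
    b*b*b contiguous sub-tensors of edge length n // b (b must divide n)."""
    n = len(tensor)
    s = n // b  # edge length of each sub-tensor
    return {
        (p, q, r): [[[tensor[p*s + i][q*s + j][r*s + k] for k in range(s)]
                     for j in range(s)] for i in range(s)]
        for p in range(b) for q in range(b) for r in range(b)
    }
```

Each sub-tensor can then be approximated with its own low-rank model.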

Extension of PCA to Higher Order Data Structures: An Introduction to Tensors, Tensor Decompositions, and Tensor PCA

This paper reviews the major tensor decomposition methods with a focus on problems targeted by classical PCA and presents tensor methods that aim to solve three important challenges typically addressed by PCA: dimensionality reduction, i.e., low-rank tensor approximation; supervised learning, i.e., …
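
The bridge from classical PCA to its tensor extensions is that principal components of an unfolding are eigenvectors of its Gram (covariance) matrix; a toy power-iteration sketch of that building block (pure Python, illustrative only, not an algorithm from the paper):

```python
def power_iteration(A, iters=200):
    """Dominant eigenvector of a small symmetric matrix A (nested lists),
    the workhorse behind leading-principal-component computation."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        # Multiply by A, then renormalize to unit length.
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```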

Tensor Decompositions and Practical Applications: A Hands-on Tutorial

The aim is not only to provide the necessary theoretical background for multilinear analysis but also to equip researchers and interested readers with easy-to-follow practical examples in the form of Python code snippets.
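
In that spirit, a self-contained snippet (pure Python, no library dependencies; the function name is made up here, not taken from the tutorial) fitting a rank-1 CP model x_ijk ≈ a_i b_j c_k by alternating least squares:

```python
def rank1_cp(X, iters=20):
    """Fit x_ijk ≈ a_i * b_j * c_k to a 3-way nested-list tensor by ALS."""
    I, J, K = len(X), len(X[0]), len(X[0][0])
    a, b, c = [1.0] * I, [1.0] * J, [1.0] * K
    for _ in range(iters):
        # Update each factor in turn, keeping a and b at unit norm;
        # the overall scale of the model is absorbed into c.
        a = [sum(X[i][j][k] * b[j] * c[k] for j in range(J) for k in range(K))
             for i in range(I)]
        na = sum(x * x for x in a) ** 0.5
        a = [x / na for x in a]
        b = [sum(X[i][j][k] * a[i] * c[k] for i in range(I) for k in range(K))
             for j in range(J)]
        nb = sum(x * x for x in b) ** 0.5
        b = [x / nb for x in b]
        c = [sum(X[i][j][k] * a[i] * b[j] for i in range(I) for j in range(J))
             for k in range(K)]
    return a, b, c
```

On an exactly rank-1 tensor with positive entries this recovers the factors up to scaling.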

New robust algorithms for sparse non-negative three-way tensor decompositions

This work focuses on PARAFAC and Tucker decompositions of three-way tensors with non-negativity and/or sparseness constraints, and proposes two decomposition algorithms that are robust to tensor order over-estimation errors, a desirable practical property when the tensor rank is unknown.
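
The matrix analogue of such non-negativity-constrained fitting, the classical Lee–Seung multiplicative updates for NMF, conveys the flavor in a few lines (a pure-Python sketch for a small dense matrix; not the paper's algorithms, which operate on three-way tensors):

```python
import random

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def nmf(V, r, iters=200, eps=1e-12):
    """Approximate V ≈ W H with W, H >= 0 via multiplicative updates.
    Each update rescales entries by a non-negative ratio, so
    non-negativity is preserved automatically."""
    random.seed(0)
    m, n = len(V), len(V[0])
    W = [[random.random() + 0.1 for _ in range(r)] for _ in range(m)]
    H = [[random.random() + 0.1 for _ in range(n)] for _ in range(r)]
    for _ in range(iters):
        WtV = matmul(transpose(W), V)
        WtWH = matmul(transpose(W), matmul(W, H))
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps) for j in range(n)]
             for i in range(r)]
        VHt = matmul(V, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps) for j in range(r)]
             for i in range(m)]
    return W, H
```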

Computing Large-Scale Matrix and Tensor Decomposition With Structured Factors: A Unified Nonconvex Optimization Perspective

Low-rank tensor and matrix decomposition models can serve a variety of purposes, e.g., data embedding, denoising, latent variable analysis, model parameter estimation, and big data compression; see [1]-[5] for surveys of applications.

Gradient-based approaches to learn tensor products

This work proposes various gradient-based methods to decompose tensors of matrix products as they appear in structured multiple-input multiple-output systems.

Tensor-based regression models and applications

A novel class of regression models, called tensor-variate regression models, in which the independent predictors and/or dependent responses take the form of high-order tensorial representations, is investigated, and a super-fast sequential tensor regression framework for general tensor sequences is proposed, which addresses the limited storage space and fast processing time imposed by dynamic environments.

Era of Big Data Processing: A New Approach via Tensor Networks and Tensor Decompositions

This work discusses some fundamental tensor network (TN) models, their mathematical and graphical descriptions, and associated learning algorithms for large-scale tensor decompositions (TDs) and TNs, with many potential applications, including anomaly detection, feature extraction, classification, cluster analysis, data fusion and integration, pattern recognition, predictive modeling, regression, time series analysis, and multiway component analysis.

Nonnegative Matrix and Tensor Factorizations : An algorithmic perspective

Approximate low-rank matrix and tensor factorizations play fundamental roles in enhancing the data and extracting latent (hidden) components in model reduction, clustering, feature extraction, classification, and blind source separation applications.

Matrix product state decomposition in machine learning and signal processing

Several new algorithms are proposed for tensor object classification, demonstrating the MPS-based approach to be efficient in comparison with other tensor-based approaches, and new methods for colour image and video completion are introduced that outperform current state-of-the-art tensor completion algorithms.
...

References

Showing 1–10 of 158 references

Block Component Analysis, a New Concept for Blind Source Separation

This paper explains how BTDs can be used for factor analysis and blind source separation, and discusses links with Canonical Polyadic Analysis (CPA) and Independent Component Analysis (ICA).

Sequential Unfolding SVD for Tensors With Applications in Array Signal Processing

A novel PARATREE tensor model is introduced, accompanied by the sequential unfolding SVD (SUSVD) algorithm; the decomposition is orthogonal, fast, and reliable to compute, and its order (or rank) can be adaptively adjusted.

Tensor Decompositions and Applications

This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or N-way array, and decompositions of higher-order tensors find applications in signal processing, numerical linear algebra, data mining, and many other fields.

Nonnegative Matrix and Tensor Factorizations - Applications to Exploratory Multi-way Data Analysis and Blind Source Separation

This book provides a broad survey of models and efficient algorithms for Nonnegative Matrix Factorization (NMF), including its various extensions and modifications, especially Nonnegative Tensor Factorization (NTF).

PARAFAC algorithms for large-scale problems

A survey of multilinear subspace learning for tensor data

Blind Signal Separation via Tensor Decomposition With Vandermonde Factor: Canonical Polyadic Decomposition

It is explained that, under new, relaxed uniqueness conditions, the number of components may simply be estimated as the rank of a matrix, and an efficient algorithm for computing the factors that resorts only to basic linear algebra is proposed.
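
The "number of components as the rank of a matrix" idea can be illustrated with a toy Vandermonde example (pure Python; the generator values and helper names are made up for illustration and are not the paper's algorithm):

```python
def vandermonde(generators, rows):
    """Vandermonde matrix whose columns are the geometric progressions
    1, z, z**2, ... for each generator z."""
    return [[z ** i for z in generators] for i in range(rows)]

def matrix_rank(A, tol=1e-9):
    """Numerical rank via Gaussian elimination with partial pivoting."""
    M = [row[:] for row in A]
    rank, rows, cols = 0, len(M), len(M[0])
    for col in range(cols):
        # Pick the largest remaining pivot in this column.
        piv = max(range(rank, rows), key=lambda r: abs(M[r][col]), default=None)
        if piv is None or abs(M[piv][col]) <= tol:
            continue
        M[rank], M[piv] = M[piv], M[rank]
        for r in range(rank + 1, rows):
            f = M[r][col] / M[rank][col]
            M[r] = [x - f * y for x, y in zip(M[r], M[rank])]
        rank += 1
    return rank
```

Distinct generators give a full-column-rank Vandermonde matrix, so the rank counts the distinct components.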

Tensor decompositions for feature extraction and classification of high dimensional datasets

This work proposes algorithms for feature extraction and classification based on orthogonal or nonnegative tensor (multi-array) decompositions, and higher order (multilinear) discriminant analysis (HODA), whereby input data are considered as tensors instead of more conventional vector or matrix representations.
...