Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives

@article{Cichocki2017TensorNF,
  title={Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives},
  author={Andrzej Cichocki and A. Phan and Qibin Zhao and Namgil Lee and I. Oseledets and Masashi Sugiyama and Danilo P. Mandic},
  journal={Found. Trends Mach. Learn.},
  year={2017},
  volume={9},
  pages={431-673}
}
This monograph builds on Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions by discussing tensor network models for super-compressed higher-order representation of data/parameters and cost functions, together with an outline of their applications in machine learning and data analytics. A particular emphasis is on elucidating, through graphical illustrations, that by virtue of the underlying low-rank tensor approximations and… 
Tensor Networks for Dimensionality Reduction, Big Data and Deep Learning
  • A. Cichocki, Advances in Data Analysis with Computational Intelligence Methods, 2018
TLDR
This work elucidates, through graphical illustrations, that by virtue of low-rank tensor approximations and sophisticated contractions of core tensors, tensor networks have the ability to perform distributed computations on otherwise prohibitively large volumes of data/parameters.
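To make the point about computing directly on the compressed representation concrete, the following minimal NumPy sketch contracts two tensors held in tensor-train form core-by-core to obtain their inner product, without ever forming the full arrays. The helper names (`tt_random`, `tt_inner`, `tt_to_full`), the dimensions, and the uniform rank are illustrative assumptions, not code from the works above.

```python
# Sketch: computing an inner product directly on tensor-train (TT) cores.
import numpy as np

def tt_random(dims, rank, rng):
    """Random TT cores with a uniform internal rank (illustrative helper)."""
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [rng.standard_normal((ranks[k], n, ranks[k + 1])) for k, n in enumerate(dims)]

def tt_inner(cores_x, cores_y):
    """<X, Y> computed by sweeping over core pairs; cost is polynomial in the TT ranks."""
    M = np.ones((1, 1))
    for G, H in zip(cores_x, cores_y):
        M = np.einsum('aA,aib,AiB->bB', M, G, H)   # contract one mode at a time
    return float(M[0, 0])

def tt_to_full(cores):
    """Reference contraction back to a dense array (only for checking the result)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

rng = np.random.default_rng(0)
X, Y = tt_random((3, 4, 5, 6), 2, rng), tt_random((3, 4, 5, 6), 2, rng)
print(tt_inner(X, Y), np.sum(tt_to_full(X) * tt_to_full(Y)))   # the two values should agree
```

The sweep touches one pair of cores at a time, which is also what makes such contractions natural to distribute.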
Reducing Computational Complexity of Tensor Contractions via Tensor-Train Networks
Tensors (multi-way arrays) and Tensor Decompositions (TDs) have recently received tremendous attention in the data analytics community, due to their ability to mitigate the Curse of Dimensionality.
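As a concrete illustration of how the tensor-train format tames the curse of dimensionality, here is a minimal TT-SVD sketch in NumPy: successive truncated SVDs split a d-way array into d small cores. The function names and the tolerance-based truncation rule are illustrative choices, not the algorithms of the paper above.

```python
# Sketch: TT-SVD via successive truncated SVDs of unfoldings.
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Return a list of TT cores G_k of shape (r_{k-1}, n_k, r_k)."""
    dims = tensor.shape
    d = len(dims)
    cores, r_prev = [], 1
    unfolding = tensor.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r_k = max(1, int(np.sum(s > eps * s[0])))              # drop (relatively) tiny singular values
        cores.append(U[:, :r_k].reshape(r_prev, dims[k], r_k))
        unfolding = (np.diag(s[:r_k]) @ Vt[:r_k]).reshape(r_k * dims[k + 1], -1)
        r_prev = r_k
    cores.append(unfolding.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into a dense array (for checking the error)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

# Example: a rank-1 4-way tensor is compressed to rank-1 cores and recovered up to rounding error.
rng = np.random.default_rng(0)
X = np.einsum('i,j,k,l->ijkl', *[rng.standard_normal(n) for n in (4, 5, 6, 7)])
cores = tt_svd(X)
print([c.shape for c in cores], np.linalg.norm(tt_to_full(cores) - X))
```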
Nonnegative canonical tensor decomposition with linear constraints: nnCANDELINC
TLDR
This work derives several results required to understand the specificity of nnCANDELINC, focusing on the difficulties of preserving the nonnegative rank of a tensor in its Tucker core and comparing the real-valued case to the nonnegative one.
Tensor Networks for Latent Variable Analysis: Higher Order Canonical Polyadic Decomposition
TLDR
A novel method for CPD of higher-order tensors, which rests upon a simple tensor network of representative interconnected core tensors of orders not higher than 3.
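For reference, the classical workhorse for CPD is alternating least squares (ALS); the sketch below fits a rank-R CPD of a 3-way array in NumPy. It is only meant to make the CPD objective concrete and is not the tensor-network reformulation for higher-order tensors proposed above; the rank, seed, and iteration count are illustrative.

```python
# Sketch: plain CP-ALS for a 3-way tensor X ~ sum_r a_r o b_r o c_r.
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product: column r is kron(U[:, r], V[:, r])."""
    R = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, R)

def cp_als(X, R, n_iter=200, seed=0):
    """Alternating least squares over the three factor matrices."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding (C order)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Quick check on a synthetic rank-3 tensor: print the relative reconstruction error.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 3)) for n in (4, 5, 6))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, R=3)
print(np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - X) / np.linalg.norm(X))
```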
Rank Minimization on Tensor Ring: A New Paradigm in Scalable Tensor Decomposition and Completion
TLDR
Taking advantage of the high compressibility of the recently proposed tensor ring (TR) decomposition, a new model for the tensor completion problem is proposed by introducing convex surrogates of the tensor low-rank assumption on the latent tensor ring factors, which makes it possible for Schatten-norm-regularized models to be solved at a much smaller scale.
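A minimal sketch of the tensor ring format itself (not of the completion model above): each entry of the full array is the trace of a product of core slices, so the first and last ranks are tied together. The core shapes and the ring rank below are illustrative.

```python
# Sketch: reconstructing a dense array from tensor ring (TR) cores.
import numpy as np

def tr_to_full(cores):
    """Contract TR cores G_k of shape (r_k, n_k, r_{k+1}), with r_{d+1} = r_1."""
    full = cores[0]                                   # shape (r_1, n_1, r_2)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    # full now has shape (r_1, n_1, ..., n_d, r_1); close the ring by tracing that bond.
    return np.trace(full, axis1=0, axis2=-1)

# Example: random cores with ring rank 2 define a 3 x 4 x 5 tensor.
rng = np.random.default_rng(0)
dims, r = (3, 4, 5), 2
cores = [rng.standard_normal((r, n, r)) for n in dims]
print(tr_to_full(cores).shape)   # (3, 4, 5)
```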
Approximation with Tensor Networks. Part II: Approximation Rates for Smoothness Classes
TLDR
It is shown how classical approximation tools, such as polynomials or splines (with fixed or free knots), can be encoded as a tensor network with controlled complexity, and direct (Jackson) inequalities for the approximation spaces of tensor networks are derived.
Tensor-Train Parameterization for Ultra Dimensionality Reduction
TLDR
A tensor-train parameterization for ultra dimensionality reduction (TTPUDR) is proposed, in which the conventional LPP mapping is tensorized via tensor trains and the squared Frobenius norm in the traditional LPP objective is replaced with the (unsquared) Frobenius norm to enhance the robustness of the model.
Multi-Branch Tensor Network Structure for Tensor-Train Discriminant Analysis
TLDR
Multi-branch implementations of TTDA are shown to achieve lower storage and computational complexity while providing improved classification performance with respect to both Tucker- and TT-based supervised learning methods.
Graph Regularized Tensor Train Decomposition
TLDR
This paper proposes a graph-regularized tensor train (GRTT) decomposition that learns a low-rank tensor train model preserving the local relationships between tensor samples, together with an efficient algorithm for solving the resulting problem.
...

References

Showing 1-10 of 273 references
Tensor Networks for Big Data Analytics and Large-Scale Optimization Problems
TLDR
The main objective of this paper is to show how tensor networks can be used to solve a wide class of big data optimization problems by applying tensorization, performing all operations using relatively small matrices and tensors, and iteratively applying optimized and approximate tensor contractions.
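The tensorization step mentioned above is, at its simplest, a reshape of a long vector into a higher-order array so that tensor-network formats can compress it. The back-of-the-envelope count below (assuming, purely for illustration, a uniform TT rank r) shows why the resulting storage can be far smaller than the raw vector.

```python
# Sketch: tensorization of a long vector and a hypothetical TT storage count.
import numpy as np

dims = [2] * 20                   # a length-2**20 vector viewed as a 20-way 2 x 2 x ... x 2 array
r = 4                             # assumed uniform TT rank (illustrative, not from the paper)

v = np.arange(np.prod(dims), dtype=float)
T = v.reshape(dims)               # tensorization: same data, higher-order view

full_storage = T.size             # 2**20 entries if stored densely
tt_storage = dims[0] * r + sum(r * n * r for n in dims[1:-1]) + r * dims[-1]
print(full_storage, tt_storage, round(full_storage / tt_storage))
# prints: 1048576 592 1771  -- roughly a 1771x reduction at this (assumed) rank
```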
Tensor Networks for Latent Variable Analysis. Part I: Algorithms for Tensor Train Decomposition
TLDR
The novel algorithms developed for the tensor train decomposition update, in an alternating way, one or several core tensors at each iteration, and exhibit enhanced mathematical tractability and scalability to exceedingly large-scale data tensors.
Era of Big Data Processing: A New Approach via Tensor Networks and Tensor Decompositions
TLDR
This work discusses some fundamental TN models, their mathematical and graphical descriptions, and associated learning algorithms for large-scale TDs and TNs, with many potential applications, including anomaly detection, feature extraction, classification, cluster analysis, data fusion and integration, pattern recognition, predictive modeling, regression, time series analysis, and multiway component analysis.
Tensor Networks and Hierarchical Tensors for the Solution of High-Dimensional Partial Differential Equations
TLDR
A survey of techniques for the computation of hierarchical low-rank approximations, including local optimisation on Riemannian manifolds as well as truncated iteration methods, which can be applied to solving high-dimensional partial differential equations.
The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format
TLDR
This article shows how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamical rank adaptation.
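A minimal NumPy sketch of the ALS idea described above, for approximating a given full array in the TT format: with all other cores frozen, the optimal update of a single core is a small linear least-squares problem, so sweeping over the cores can only decrease the fit error. Only the basic ALS micro-step is shown, not the rank-adaptive MALS variant; the ranks, dimensions, and sweep count are illustrative.

```python
# Sketch: one-core ALS update for TT approximation of a full array.
import numpy as np

def als_core_update(X, cores, k):
    """Optimal update of core k (all other cores fixed): a linear least-squares step."""
    dims = [c.shape[1] for c in cores]
    L = np.ones((1, 1))                                   # left interface: cores 0..k-1
    for c in cores[:k]:
        L = np.tensordot(L, c, axes=([-1], [0])).reshape(-1, c.shape[-1])
    R = np.ones((1, 1))                                   # right interface: cores k+1..d-1
    for c in reversed(cores[k + 1:]):
        R = np.tensordot(c, R, axes=([-1], [0])).reshape(c.shape[0], -1)
    T = X.reshape(L.shape[0], dims[k], R.shape[1])        # group modes left of k, mode k, modes right of k
    cores[k] = np.einsum('ai,ijm,mb->ajb', np.linalg.pinv(L), T, np.linalg.pinv(R))

def tt_to_full(cores):
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

# Fit a random TT-rank-2 target starting from random rank-2 cores; the error is
# non-increasing over sweeps because every micro-step solves its subproblem exactly.
rng = np.random.default_rng(0)
dims, ranks = (3, 4, 5, 6), [1, 2, 2, 2, 1]
target = tt_to_full([rng.standard_normal((ranks[i], n, ranks[i + 1]))
                     for i, n in enumerate(dims)])
cores = [rng.standard_normal((ranks[i], n, ranks[i + 1])) for i, n in enumerate(dims)]
for sweep in range(5):
    for k in list(range(len(cores))) + list(range(len(cores) - 2, -1, -1)):
        als_core_update(target, cores, k)
    print(sweep, np.linalg.norm(tt_to_full(cores) - target) / np.linalg.norm(target))
```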
Riemannian Optimization for High-Dimensional Tensor Completion
TLDR
A nonlinear conjugate gradient scheme within the framework of Riemannian optimization which exploits the favorable scaling of low-rank tensor formats to obtain competitive reconstructions from uniform random sampling of few entries compared to adaptive sampling techniques such as cross-approximation.
Tucker Tensor Regression and Neuroimaging Analysis
TLDR
A tensor regression model based on the more flexible Tucker decomposition, which allows a different number of factors along each mode and leads to several advantages particularly suited to neuroimaging analysis, including a further reduction in the number of free parameters.
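To make the "different number of factors along each mode" point concrete, below is a minimal truncated HOSVD sketch in NumPy for the Tucker format. It covers only the decomposition, not the Tucker regression model of the paper above, and the multilinear ranks in the example are illustrative.

```python
# Sketch: truncated HOSVD, i.e., a Tucker decomposition with per-mode ranks.
import numpy as np

def hosvd(X, ranks):
    """Return core G and factors U_k with X ~ G x_1 U_1 x_2 U_2 ... x_d U_d."""
    factors = []
    for k in range(X.ndim):
        unfold_k = np.moveaxis(X, k, 0).reshape(X.shape[k], -1)   # mode-k unfolding
        U, _, _ = np.linalg.svd(unfold_k, full_matrices=False)
        factors.append(U[:, :ranks[k]])                           # leading mode-k subspace
    G = X
    for k, U in enumerate(factors):                                # project onto the subspaces
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, k, 0), axes=1), 0, k)
    return G, factors

def tucker_to_full(G, factors):
    X = G
    for k, U in enumerate(factors):
        X = np.moveaxis(np.tensordot(U, np.moveaxis(X, k, 0), axes=1), 0, k)
    return X

# Example: exact recovery when the chosen ranks are not below the true multilinear ranks.
rng = np.random.default_rng(0)
X = tucker_to_full(rng.standard_normal((2, 3, 4)),
                   [rng.standard_normal((n, r)) for n, r in zip((6, 7, 8), (2, 3, 4))])
G, Us = hosvd(X, ranks=(2, 3, 4))
print(G.shape, np.linalg.norm(tucker_to_full(G, Us) - X) / np.linalg.norm(X))
```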
Tensor-based regression models and applications
TLDR
A novel class of regression models, called tensor-variate regression models, in which the independent predictors and/or dependent responses take the form of high-order tensorial representations, is investigated, and a super-fast sequential tensor regression framework for general tensor sequences is proposed, which addresses the limited storage space and fast processing times imposed by dynamic environments.
Fully Scalable Methods for Distributed Tensor Factorization
TLDR
This paper proposes two distributed tensor factorization methods, CDTF and SALS, which are scalable with all aspects of data and show a trade-off between convergence speed and memory requirements.
...