Corpus ID: 240288569

The Set of Orthogonal Tensor Trains

@inproceedings{Semnani2021TheSO,
  title={The Set of Orthogonal Tensor Trains},
  author={Pardis Semnani and Elina Robeva},
  year={2021}
}
In this paper we study the set of tensors that admit a special type of decomposition called an orthogonal tensor train decomposition. Finding equations defining varieties of low-rank tensors is generally a hard problem; however, the set of orthogonally decomposable tensors is defined by appealing quadratic equations. The tensors we consider extend orthogonally decomposable tensors. We show that they are defined by similar quadratic equations, as well as an interesting higher-degree…
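To make the notion of orthogonal decomposability concrete, the following sketch (an illustration assuming NumPy; not code from the paper) constructs a symmetric orthogonally decomposable 3-way tensor as a sum of cubes of orthonormal vectors and checks the eigenvector-like contraction property that makes such tensors tractable:

```python
import numpy as np

# Build a symmetric odeco tensor T = sum_i lam[i] * v_i (x) v_i (x) v_i,
# where the v_i are orthonormal columns of a random orthogonal matrix Q.
rng = np.random.default_rng(0)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns
lam = rng.uniform(1.0, 2.0, size=n)               # positive weights

T = np.zeros((n, n, n))
for i in range(n):
    v = Q[:, i]
    T += lam[i] * np.einsum("a,b,c->abc", v, v, v)

# Contracting T against v_i in two slots returns lam[i] * v_i, since the
# cross terms vanish by orthonormality of the v_i.
w = np.einsum("abc,a,b->c", T, Q[:, 0], Q[:, 0])
assert np.allclose(w, lam[0] * Q[:, 0])
```

The contraction identity shown in the last lines is what reduces recovering an orthogonal decomposition to a matrix-eigenvector-style computation, in contrast to general low-rank tensor decomposition.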
