Greedy Approaches to Symmetric Orthogonal Tensor Decomposition

@article{Mu2017GreedyAT,
  title={Greedy Approaches to Symmetric Orthogonal Tensor Decomposition},
  author={Cun Mu and Daniel J. Hsu and Donald Goldfarb},
  journal={ArXiv},
  year={2017},
  volume={abs/1706.01169}
}
Finding the symmetric and orthogonal decomposition of a tensor is a recurring problem in signal processing, machine learning, and statistics. In this paper, we review, establish, and compare the perturbation bounds for two natural types of incremental rank-one approximation approaches. Numerical experiments and open questions are also presented and discussed. 
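To make the greedy scheme concrete, here is a minimal numpy sketch of successive rank-one approximation for a symmetric third-order tensor: repeatedly estimate the dominant rank-one term with the tensor power method, record it, and deflate. The function names, restart heuristic, and iteration counts below are illustrative choices, not the paper's exact procedure.

import numpy as np

def rank_one_approx(T, n_iters=100, n_restarts=10, seed=None):
    """Estimate the dominant symmetric rank-one term of a symmetric
    order-3 tensor T via the (restarted) tensor power method."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    best_lam, best_v = -np.inf, None
    for _ in range(n_restarts):
        v = rng.standard_normal(d)
        v /= np.linalg.norm(v)
        for _ in range(n_iters):
            w = np.einsum('ijk,j,k->i', T, v, v)    # power update: T(I, v, v)
            norm_w = np.linalg.norm(w)
            if norm_w == 0:
                break
            v = w / norm_w
        lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # Rayleigh-like value T(v, v, v)
        if lam > best_lam:
            best_lam, best_v = lam, v
    return best_lam, best_v

def sroa(T, rank):
    """Successive rank-one approximation: greedily extract rank-one
    terms, deflating the residual after each extraction."""
    lams, vs = [], []
    residual = T.copy()
    for _ in range(rank):
        lam, v = rank_one_approx(residual)
        lams.append(lam)
        vs.append(v)
        residual = residual - lam * np.einsum('i,j,k->ijk', v, v, v)  # deflation
    return np.array(lams), np.stack(vs)

For an exactly orthogonally decomposable input (a sum of terms lam_i v_i (x) v_i (x) v_i with orthonormal v_i and lam_i > 0), this loop recovers the pairs (lam_i, v_i) up to numerical error; the perturbation bounds studied in the paper quantify how recovery degrades when the tensor is only approximately decomposable.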

Citations

Optimal orthogonal approximations to symmetric tensors cannot always be chosen symmetric
TLDR
It is shown that optimal orthogonal approximations of rank greater than one cannot always be chosen to be symmetric.
Perturbation Bounds for Orthogonally Decomposable Tensors and Their Applications in High Dimensional Data Analysis
TLDR
The implications of deterministic perturbation bounds for singular values and vectors of orthogonally decomposable tensors are illustrated through three connected yet seemingly different high dimensional data analysis tasks: tensor SVD, tensor regression and estimation of latent variable models, leading to new insights in each of these settings.
Recovering orthogonal tensors under arbitrarily strong, but locally correlated, noise
TLDR
The problem of recovering an orthogonally decomposable tensor with a subset of elements distorted by noise of arbitrarily large magnitude can be solved through a system of coupled Sylvester-like equations, and it is shown how to accelerate their solution with an alternating solver.
Successive Partial-Symmetric Rank-One Algorithms for Almost Unitarily Decomposable Conjugate Partial-Symmetric Tensors
In this paper, we introduce almost unitarily decomposable conjugate partial-symmetric tensors, a class distinct from the commonly studied orthogonally decomposable tensors.
Perturbation Bounds for (Nearly) Orthogonally Decomposable Tensors
We develop deterministic perturbation bounds for singular values and vectors of orthogonally decomposable tensors, in a spirit similar to classical results for matrices such as those due to Weyl.
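For reference, the matrix prototype that such results generalize is Weyl's perturbation bound for singular values: for matrices A, E of the same size,

\[
  \bigl|\,\sigma_i(A + E) - \sigma_i(A)\,\bigr| \;\le\; \|E\|_2,
  \qquad i = 1, \dots, \min(m, n),
\]

with the tensor bounds above playing the analogous role for orthogonally decomposable tensors.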
A Sharp Blockwise Tensor Perturbation Bound for Orthogonal Iteration
TLDR
It is proved that one-step HOOI is also optimal in terms of tensor reconstruction and can be used to lower the computational cost, and the perturbation results are extended to the case where only some modes of the tensor have low-rank structure.
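For context on the iteration being analyzed, here is a minimal numpy sketch of standard multi-sweep higher-order orthogonal iteration (HOOI) for an order-3 tensor; this is a generic textbook version with helper names of our choosing, not the paper's one-step variant.

import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the given mode to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """Mode-n product: contract the given mode of T with the columns of M."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def hooi(T, ranks, n_sweeps=10):
    """Higher-order orthogonal iteration for an order-3 tensor.
    Initialize with truncated HOSVD, then alternately refresh each
    factor from an SVD of the tensor projected on the other modes."""
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_sweeps):
        for n in range(3):
            Y = T
            for m in range(3):
                if m != n:
                    Y = mode_product(Y, U[m].T, m)   # project the other modes
            U[n] = np.linalg.svd(unfold(Y, n),
                                 full_matrices=False)[0][:, :ranks[n]]
    G = T
    for m in range(3):
        G = mode_product(G, U[m].T, m)               # core tensor
    return G, U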
Using negative curvature in solving nonlinear programs
TLDR
This work generalizes the approach of using negative curvature directions from unconstrained optimization to equality-constrained problems and proves that the proposed negative curvature method is guaranteed to converge to a stationary point satisfying second-order necessary conditions.
Revisiting Skip-Gram Negative Sampling Model with Regularization
TLDR
This work revisits skip-gram negative sampling and rectifies the SGNS model with quadratic regularization, and shows that this simple modification suffices to structure the solution in the desired manner.
Hypergraph Spectral Clustering for Point Cloud Segmentation
TLDR
This work investigates the power of hypergraph spectral analysis in unsupervised segmentation of 3D point clouds, and develops a clustering-based segmentation method based on spectral component strengths.

References

Showing 1–10 of 37 references
Orthogonal Tensor Decompositions
  • T. Kolda
  • Mathematics, Computer Science
    SIAM J. Matrix Anal. Appl.
  • 2001
TLDR
The orthogonal decomposition of tensors (also known as multidimensional arrays or n-way arrays) is explored under two different definitions of orthogonality, and a counterexample to a tensor extension of the Eckart–Young SVD approximation theorem is given.
Singular values and eigenvalues of tensors: a variational approach
  • Lek-Heng Lim
  • Mathematics
    1st IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, 2005.
  • 2005
We propose a theory of eigenvalues, eigenvectors, singular values, and singular vectors for tensors, based on a constrained variational approach much like the Rayleigh quotient for symmetric matrices.
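Concretely, in the l2 case this variational approach takes critical points of the homogeneous form of a symmetric order-m tensor on the unit sphere, and the first-order optimality conditions define the eigenpairs:

\[
  \max_{\|x\|_2 = 1} \;\mathcal{A}x^m
  := \sum_{i_1, \dots, i_m} \mathcal{A}_{i_1 \cdots i_m}\, x_{i_1} \cdots x_{i_m},
  \qquad \text{with critical points satisfying} \qquad
  \mathcal{A}x^{m-1} = \lambda x .
\]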
Tensor Decompositions via Two-Mode Higher-Order SVD (HOSVD)
TLDR
This new method, built on Kruskal's uniqueness theorem, decomposes symmetric, nearly orthogonally decomposable tensors; it provably handles a greater level of noise than previous methods and achieves high estimation accuracy.
Successive Rank-One Approximations for Nearly Orthogonally Decomposable Symmetric Tensors
TLDR
This paper shows that even in the presence of perturbation, SROA can still robustly recover the symmetric canonical decomposition of the underlying tensor.
Rank-One Approximation to High Order Tensors
TLDR
The singular value decomposition has been used extensively in engineering and statistical applications; certain properties of this decomposition, as well as numerical algorithms, are investigated.
Orthogonal Decomposition of Symmetric Tensors
  • Elina Robeva
  • Mathematics, Computer Science
    SIAM J. Matrix Anal. Appl.
  • 2016
TLDR
This work gives a set of polynomial equations that vanish on the odeco variety and conjectures that these polynomials generate its prime ideal; the conjecture is proved in some cases, with strong evidence given for its overall correctness.
Tensor decompositions for learning latent variable models
TLDR
A detailed analysis of a robust tensor power method is provided, establishing an analogue of Wedin's perturbation theorem for the singular vectors of matrices; this implies a robust and computationally tractable estimation approach for several popular latent variable models.
On the Best Rank-1 and Rank-(R1 , R2, ... , RN) Approximation of Higher-Order Tensors
TLDR
A multilinear generalization of the best rank-R approximation problem for matrices, namely, the approximation of a given higher-order tensor, in an optimal least-squares sense, by a tensor that has prespecified column rank, row rank, etc.
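In symbols, the problem seeks a core tensor S and factors with orthonormal columns that minimize the least-squares error

\[
  \min_{\mathcal{S},\, U^{(1)}, \dots, U^{(N)}}
  \bigl\| \mathcal{T} - \mathcal{S} \times_1 U^{(1)} \times_2 U^{(2)} \cdots \times_N U^{(N)} \bigr\|_F,
  \qquad U^{(n)} \in \mathbb{R}^{I_n \times R_n},\; U^{(n)\top} U^{(n)} = I,
\]

which for N = 2 reduces to the familiar low-rank matrix approximation problem.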
Tensor principal component analysis via convex optimization
TLDR
The tensor PCA problem can be solved by means of matrix optimization under a rank-one constraint, for which two solution methods are proposed: imposing a nuclear norm penalty in the objective to enforce a low-rank solution, and relaxing the rank-one constraint by semidefinite programming.
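As a rough schematic of that reformulation (illustrative only: M(.) denotes a matricization of the tensor, and the paper's exact program may differ), the rank-one-constrained matrix problem reads

\[
  \max_{X} \;\; \langle M(\mathcal{T}),\, X \rangle
  \quad \text{s.t.} \quad \|X\|_F = 1, \quad \operatorname{rank}(X) = 1,
\]

which is then relaxed either by adding a nuclear-norm penalty to the objective to promote a low-rank solution, or by dropping the rank constraint and solving a semidefinite program over a lifted variable.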
The Best Rank-1 Approximation of a Symmetric Tensor and Related Spherical Optimization Problems
TLDR
A positive lower bound for the best rank-1 approximation ratio of a symmetric tensor is given, and it is shown that a higher-order polynomial spherical optimization problem can be reformulated as a multilinear spherical optimization problem.