# Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations

```bibtex
@article{Hu2019LinearCO,
  title={Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations},
  author={Shenglong Hu and Ke Ye},
  journal={Mathematical Programming},
  year={2019}
}
```
• Published 9 December 2019
• Mathematics, Computer Science
• Mathematical Programming
Low rank orthogonal tensor approximation (LROTA) is an important problem in tensor computations and their applications. A classical and widely used algorithm is the alternating polar decomposition method (APD). In this article, an improved version iAPD of the classical APD is proposed. For the first time, all the following four fundamental properties are established for iAPD: (i) the algorithm converges globally and the whole sequence converges to a KKT point without any assumption; (ii) it…
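To make the setting concrete, here is a minimal sketch of the classical alternating polar decomposition idea for an order-3 tensor: each factor matrix with orthonormal columns is updated in turn as the orthonormal polar factor of a contraction of the tensor with the other factors. This is an illustrative toy (the function names, the einsum-based contractions, and the fixed iteration count are my own choices), not the paper's improved iAPD, which adds further corrections to obtain its convergence guarantees.

```python
import numpy as np

def polar_orthonormal(M):
    # Orthonormal polar factor of M via thin SVD:
    # the maximizer of trace(Q^T M) over matrices Q with Q^T Q = I.
    P, _, Qt = np.linalg.svd(M, full_matrices=False)
    return P @ Qt

def apd_lrota(T, r, iters=200, seed=0):
    """Toy alternating polar decomposition for a rank-r orthogonal
    approximation of an order-3 tensor T (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    # Random columnwise-orthonormal starting factors.
    U = np.linalg.qr(rng.standard_normal((n1, r)))[0]
    V = np.linalg.qr(rng.standard_normal((n2, r)))[0]
    W = np.linalg.qr(rng.standard_normal((n3, r)))[0]
    for _ in range(iters):
        # Column i of M is T contracted with the i-th columns of the
        # other two factors; the polar factor of M is the best
        # columnwise-orthonormal update for the remaining factor.
        M = np.einsum('abc,bi,ci->ai', T, V, W)
        U = polar_orthonormal(M)
        M = np.einsum('abc,ai,ci->bi', T, U, W)
        V = polar_orthonormal(M)
        M = np.einsum('abc,ai,bi->ci', T, U, V)
        W = polar_orthonormal(M)
    # Optimal weights are inner products of T with the rank-1 terms,
    # which are mutually orthonormal because U, V, W are.
    sigma = np.einsum('abc,ai,bi,ci->i', T, U, V, W)
    approx = np.einsum('i,ai,bi,ci->abc', sigma, U, V, W)
    return U, V, W, sigma, approx
```

Because each update is a polar factor, every iterate keeps exactly orthonormal columns, and the final approximation is an orthogonal projection of T onto the span of the rank-1 terms, so its residual never exceeds the norm of T.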
## 7 Citations
• Computer Science, Mathematics
• 2020
This paper derives a robust orthogonal tensor CPD model with Cauchy loss, which is resistant to heavy-tailed noise or outliers and shows that the whole sequence generated by the algorithm globally converges to a stationary point of the problem under consideration.
• Computer Science, Mathematics
ArXiv
• 2019
It turns out that well-known algorithms are all special cases of this general algorithmic framework and its symmetric variant, and the convergence results subsume the results found in the literature designed for those special cases.
• Yuning Yang
• Computer Science
Journal of Optimization Theory and Applications
• 2022
The presented results fill a gap left in Yang (SIAM J Matrix Anal Appl 41:1797–1825, 2020), where the approximation bound of that approximation algorithm was established when there is only one orthonormal factor.
• Mathematics, Computer Science
ArXiv
• 2021
This paper studies gradient-based Jacobi-type algorithms to maximize two classes of homogeneous polynomials with orthogonality constraints and establishes their convergence properties; it also proposes the Jacobi-GP and Jacobi-MGP algorithms and establishes their global convergence without any further condition.
• Computer Science, Mathematics
Numerical Linear Algebra with Applications
• 2022
It is shown that the problem of recovering an orthogonally decomposable tensor, a subset of whose elements is distorted by noise of arbitrarily large magnitude, can be solved through a system of coupled Sylvester-like equations, and that their solution can be accelerated by an alternating solver.
• Shenglong Hu
• Computer Science, Physics
Science China Mathematics
• 2022
The main results are: (i) each (Z-)eigenvector/singular vector tuple of a generic tensor is nondegenerate, and (ii) each nonzero Z-eigenvector/singular vector tuple of an orthogonally decomposable tensor is nondegenerate.

## References

Showing 1-10 of 75 references

• Yuning Yang
• Computer Science
SIAM J. Matrix Anal. Appl.
• 2020
The epsilon alternating least squares ($\epsilon$-ALS) is developed and analyzed for canonical polyadic decomposition (approximation) of a higher-order tensor where one or more of the factor matrices…
• Computer Science
SIAM J. Matrix Anal. Appl.
• 2015
The conventional high-order power method is modified to enforce the desired orthogonality via the polar decomposition, and it is shown that for almost all tensors the orthogonal alternating least squares method converges globally.
• Computer Science
SIAM J. Matrix Anal. Appl.
• 2014
This paper partially addresses the missing piece by showing that for almost all tensors, the iterates generated by the alternating least squares method for the rank-one approximation converge globally.
• Computer Science, Mathematics
ArXiv
• 2019
A Jacobi-type algorithm to solve the low rank orthogonal approximation problem of symmetric tensors is proposed, and it is proved that an accumulation point is the unique limit point under some conditions.
• Mathematics
Numerische Mathematik
• 2018
It is established that the sequence generated by HOPM always converges globally and R-linearly for orthogonally decomposable tensors of order at least 3, and that for almost all tensors all singular vector tuples are nondegenerate, so HOPM "typically" exhibits a global R-linear convergence rate.
• Computer Science
SIAM J. Matrix Anal. Appl.
• 2008
The existence of an optimal approximation is theoretically guaranteed under certain conditions, and this optimal approximation yields a tensor decomposition where the diagonal of the core is maximized.
• Mathematics, Computer Science
SIAM J. Matrix Anal. Appl.
• 2018
This paper considers a family of Jacobi-type algorithms for the simultaneous orthogonal diagonalization problem of symmetric tensors, proposes a new Jacobi-based algorithm in the general setting, and proves its global convergence for sufficiently smooth functions.
• Mathematics
SIAM J. Matrix Anal. Appl.
• 2012
Orthogonality-constrained versions of the CPD methods based on simultaneous matrix diagonalization and alternating least squares are presented and a simple proof of the existence of the optimal low-rank approximation of a tensor in the case that a factor matrix is columnwise orthonormal is given.
A local convergence theorem for calculating canonical low-rank tensor approximations (PARAFAC, CANDECOMP) by the alternating least squares algorithm is established. The main assumption is that the…
• Mathematics, Computer Science
SIAM J. Matrix Anal. Appl.
• 2008
It is argued that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations, and a natural way of overcoming the ill-posedness of the low-rank approximation problem is proposed by using weak solutions when true solutions do not exist.