# Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence

```bibtex
@article{Wang2015OrthogonalLR,
  title   = {Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence},
  author  = {Liqi Wang and Moody T. Chu and Bo Yu},
  journal = {SIAM J. Matrix Anal. Appl.},
  year    = {2015},
  volume  = {36},
  pages   = {1--19}
}
```
• Published 15 January 2015 • Computer Science • SIAM J. Matrix Anal. Appl.
With two notable exceptions---tensors of order 2, namely matrices, always have best approximations of arbitrary low rank, and tensors of any order always have a best rank-1 approximation---it is known that higher-order tensors may fail to have best low rank approximations. When the condition of orthogonality is imposed, even under the modest assumption of semiorthogonality, where only one set of components in the decomposed rank-1 tensors is required to be mutually…
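The alternating least squares (ALS) iteration of the title can be illustrated in its simplest setting, the best rank-1 approximation of a third-order tensor, where each factor is updated in turn by solving a least-squares problem with the other factors fixed. The sketch below is a minimal NumPy illustration under that assumption (the function name `rank1_als` is hypothetical, and this is the classical higher-order power method, not the paper's full algorithm for general orthogonal low rank):

```python
import numpy as np

def rank1_als(T, iters=100, seed=0):
    """Best rank-1 approximation of a 3rd-order tensor by ALS
    (higher-order power method). Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    u = rng.standard_normal(n1); u /= np.linalg.norm(u)
    v = rng.standard_normal(n2); v /= np.linalg.norm(v)
    w = rng.standard_normal(n3); w /= np.linalg.norm(w)
    for _ in range(iters):
        # Each step solves a linear least-squares problem in one
        # factor with the other two fixed; the minimizer is the
        # normalized contraction of T against the fixed factors.
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)  # optimal scalar weight
    return lam, u, v, w
```

On an exactly rank-1 tensor the iteration recovers the underlying factors up to sign; the global convergence analyzed in the paper concerns the behavior of such iterates from generic starting points.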
## 28 Citations
• Yuning Yang, *SIAM J. Matrix Anal. Appl.*, 2020 (Computer Science): The epsilon alternating least squares ($\epsilon$-ALS) method is developed and analyzed for canonical polyadic decomposition (approximation) of a higher-order tensor where one or more of the factor matrices…
• *Mathematical Programming*, 2022 (Mathematics, Computer Science): An improved version, iAPD, of the classical APD is proposed, which exhibits overall sublinear convergence with an explicit rate that is sharper than the usual $O(1/k)$ for first-order methods in optimization.
• *ArXiv*, 2019 (Mathematics, Computer Science): It is shown that optimal orthogonal approximations of rank greater than one cannot always be chosen to be symmetric.
• Yuning Yang, *Journal of Optimization Theory and Applications*, 2022 (Computer Science): The presented results fill a gap left in Yang (SIAM J Matrix Anal Appl 41:1797–1825, 2020), where the approximation bound of that approximation algorithm was established only for the case of a single orthonormal factor.
• 2020 (Computer Science, Mathematics): A robust orthogonal tensor CPD model with Cauchy loss is derived, which is resistant to heavy-tailed noise and outliers; the whole sequence generated by the algorithm is shown to converge globally to a stationary point of the problem under consideration.
• Chao Zeng, *Journal of Scientific Computing*, 2022 (Computer Science, Mathematics): This work presents several properties of orthogonal rank, which differ from those of tensor rank in many respects, and proposes an algorithm based on the augmented Lagrangian method that has a great advantage over existing methods for strongly orthogonal decompositions in terms of approximation error.
• *Numerical Linear Algebra with Applications*, 2022 (Computer Science, Mathematics): It is shown that the problem of recovering an orthogonally decomposable tensor, with a subset of elements distorted by noise of arbitrarily large magnitude, can be solved through a system of coupled Sylvester-like equations, and that their solution can be accelerated by an alternating solver.
• A. Dax, *Advances in Linear Algebra & Matrix Theory*, 2020 (Mathematics): This note explores the relations between two different methods. The first is the Alternating Least Squares (ALS) method for calculating a rank-k approximation of a real m×n matrix A. This method…
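The matrix ALS scheme described in that note alternates two linear least-squares solves: with one factor fixed, the other has a closed-form optimum. A minimal NumPy sketch under that reading (the name `matrix_als` is hypothetical, not the note's code):

```python
import numpy as np

def matrix_als(A, k, iters=50, seed=0):
    """Rank-k approximation A ≈ U V^T by alternating least squares.
    Illustrative sketch: each half-step has a closed-form solution."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    V = rng.standard_normal((n, k))
    for _ in range(iters):
        # U minimizes ||A - U V^T||_F with V fixed (normal equations).
        U = A @ V @ np.linalg.pinv(V.T @ V)
        # V minimizes ||A - U V^T||_F with U fixed.
        V = A.T @ U @ np.linalg.pinv(U.T @ U)
    return U, V
```

For matrices this iteration is essentially orthogonal (subspace) iteration, which is why the rank-k case is well behaved; the tensor analogue discussed in the main paper is where existence and convergence become delicate.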
• Computer Science, Mathematics
Frontiers of Mathematics
• 2023
Based on the alternating least squares (ALS) technique and the singular value decomposition (SVD) technique, the algorithms for finding the structure preserving best rank-one approximations of even

## References

Showing 1-10 of 66 references.

• *SIAM J. Matrix Anal. Appl.*, 2014 (Computer Science): This paper partially addresses the missing piece by showing that, for almost all tensors, the iterates generated by the alternating least squares method for the rank-one approximation converge globally.
• *SIAM J. Matrix Anal. Appl.*, 2002 (Mathematics): It is shown that a symmetric version of the above method converges under assumptions of convexity (or concavity) for the functional induced by the tensor in question, assumptions that are very often satisfied in practical applications.
• *SIAM J. Matrix Anal. Appl.*, 2008 (Mathematics, Computer Science): It is argued that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations; a natural way of overcoming the ill-posedness of the low-rank approximation problem is proposed by using weak solutions when true solutions do not exist.
• *SIAM J. Matrix Anal. Appl.*, 2011 (Computer Science, Mathematics): This paper proposes a new iterative algorithm based on the Riemannian trust-region scheme, using the truncated conjugate-gradient method to solve the trust-region subproblem; the new method is compared with the well-known higher-order orthogonal iteration method, and its advantages over Newton-type methods are discussed.
• *SIAM J. Matrix Anal. Appl.*, 2000 (Mathematics, Computer Science): A multilinear generalization of the best rank-R approximation problem for matrices, namely, the approximation of a given higher-order tensor, in an optimal least-squares sense, by a tensor with prespecified column rank, row rank, etc.
• *2009 17th European Signal Processing Conference*, 2009 (Computer Science, Mathematics)
• 2010 (Computer Science): This paper deals with the best low multilinear rank approximation of higher-order tensors, used as a tool for dimensionality reduction and signal subspace estimation.
• *SIAM J. Matrix Anal. Appl.*, 2012 (Mathematics): Orthogonality-constrained versions of the CPD methods based on simultaneous matrix diagonalization and alternating least squares are presented, and a simple proof is given of the existence of the optimal low-rank approximation of a tensor in the case that a factor matrix is columnwise orthonormal.
• *SIAM J. Matrix Anal. Appl.*, 2008 (Mathematics, Computer Science): The notion of the generic symmetric rank is discussed, which, due to the work of Alexander and Hirschowitz, is now known for any values of dimension and order.