Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence
@article{Wang2015OrthogonalLR,
  title={Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence},
  author={Liqi Wang and Moody T. Chu and Bo Yu},
  journal={SIAM J. Matrix Anal. Appl.},
  year={2015},
  volume={36},
  pages={1-19}
}
With the notable exceptions of two cases---that tensors of order 2, namely, matrices, always have best approximations of arbitrary low ranks and that tensors of any order always have a best rank-1 approximation---it is known that high-order tensors may fail to have best low rank approximations. When the condition of orthogonality is imposed, even under the modest assumption of semiorthogonality, where only one set of components in the decomposed rank-1 tensors is required to be mutually…
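The abstract concerns the alternating least squares (ALS) method for low rank tensor approximation. As a minimal illustration of the general idea, the following is a sketch of ALS for the best rank-1 approximation of an order-3 tensor: each unit-norm factor is updated in turn by contracting the tensor against the other two. This is our own illustrative code (the name `rank1_als` and all parameters are assumptions), not the paper's algorithm, which treats the more general orthogonal low rank case.

```python
import numpy as np

def rank1_als(T, iters=200, seed=0):
    """ALS sketch for the best rank-1 approximation of an order-3 tensor T.

    Cyclically updates unit-norm factors (u, v, w); the optimal scalar
    weight lam is the full contraction of T against the factors.
    """
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(n) for n in T.shape)
    u, v, w = u / np.linalg.norm(u), v / np.linalg.norm(v), w / np.linalg.norm(w)
    for _ in range(iters):
        # Each update solves a linear least squares problem in one factor.
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w
```

For a tensor that is exactly rank 1, e.g. `T = 3 * np.einsum('i,j,k->ijk', a, b, c)` with unit vectors `a, b, c`, the iteration recovers `lam` close to 3 (the spectral norm of `T`) after the first sweep from a generic start.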
28 Citations
The Epsilon-Alternating Least Squares for Orthogonal Low-Rank Tensor Approximation and Its Global Convergence
- Computer Science, SIAM J. Matrix Anal. Appl.
- 2020
The epsilon alternating least squares ($\epsilon$-ALS) is developed and analyzed for canonical polyadic decomposition (approximation) of a higher-order tensor where one or more of the factor matrices…
Linear convergence of an alternating polar decomposition method for low rank orthogonal tensor approximations
- Mathematics, Computer Science, Mathematical Programming
- 2022
An improved version, iAPD, of the classical APD is proposed, which exhibits overall sublinear convergence with an explicit rate that is sharper than the usual $O(1/k)$ for first-order methods in optimization.
Optimal orthogonal approximations to symmetric tensors cannot always be chosen symmetric
- Mathematics, Computer Science, ArXiv
- 2019
It is shown that optimal orthogonal approximations of rank greater than one cannot always be chosen to be symmetric.
On Approximation Algorithm for Orthogonal Low-Rank Tensor Approximation
- Computer Science, Journal of Optimization Theory and Applications
- 2022
The presented results fill a gap left in Yang (SIAM J Matrix Anal Appl 41:1797–1825, 2020), where the approximation bound of that approximation algorithm was established when there is only one orthonormal factor.
Half-Quadratic Alternating Direction Method of Multipliers for Robust Orthogonal Tensor Approximation
- Computer Science, Mathematics
- 2020
This paper derives a robust orthogonal tensor CPD model with Cauchy loss, which is resistant to heavy-tailed noise and outliers, and shows that the whole sequence generated by the algorithm converges globally to a stationary point of the problem under consideration.
Rank Properties and Computational Methods for Orthogonal Tensor Decompositions
- Computer Science, Mathematics, Journal of Scientific Computing
- 2022
This work presents several properties of orthogonal rank, which differ from those of tensor rank in many respects, and proposes an algorithm based on the augmented Lagrangian method that has a great advantage over existing methods for strongly orthogonal decompositions in terms of approximation error.
Recovering orthogonal tensors under arbitrarily strong, but locally correlated, noise
- Computer Science, Mathematics, Numerical Linear Algebra with Applications
- 2022
It is shown that the problem of recovering an orthogonally decomposable tensor, with a subset of elements distorted by noise of arbitrarily large magnitude, can be solved through a system of coupled Sylvester-like equations, and how their solution can be accelerated by an alternating solver.
The Equivalence between Orthogonal Iterations and Alternating Least Squares
- Mathematics, Advances in Linear Algebra &amp; Matrix Theory
- 2020
This note explores the relations between two different methods. The first one is the Alternating Least Squares (ALS) method for calculating a rank-k approximation of a real m×n matrix, A. This method…
Algorithms for Structure Preserving Best Rank-one Approximations of Partially Symmetric Tensors
- Computer Science, Mathematics, Frontiers of Mathematics
- 2023
Based on the alternating least squares (ALS) technique and the singular value decomposition (SVD) technique, the algorithms for finding the structure preserving best rank-one approximations of even…
References
Showing 1–10 of 66 references
On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors
- Computer Science, SIAM J. Matrix Anal. Appl.
- 2014
This paper partially addresses the missing piece by showing that for almost all tensors, the iterates generated by the alternating least squares method for the rank-one approximation converge globally.
On the Best Rank-1 Approximation of Higher-Order Supersymmetric Tensors
- Mathematics, SIAM J. Matrix Anal. Appl.
- 2002
It is shown that a symmetric version of the above method converges under assumptions of convexity (or concavity) for the functional induced by the tensor in question, assumptions that are very often satisfied in practical applications.
Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem
- Mathematics, Computer Science, SIAM J. Matrix Anal. Appl.
- 2008
It is argued that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations; a natural way of overcoming this ill-posedness is proposed, using weak solutions when true solutions do not exist.
Best Low Multilinear Rank Approximation of Higher-Order Tensors, Based on the Riemannian Trust-Region Scheme
- Computer Science, Mathematics, SIAM J. Matrix Anal. Appl.
- 2011
This paper proposes a new iterative algorithm based on the Riemannian trust-region scheme, using the truncated conjugate-gradient method to solve the trust-region subproblem; the new method is compared with the well-known higher-order orthogonal iteration method, and its advantages over Newton-type methods are discussed.
On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
- Mathematics, Computer Science, SIAM J. Matrix Anal. Appl.
- 2000
A multilinear generalization of the best rank-R approximation problem for matrices, namely, the approximation of a given higher-order tensor, in an optimal least-squares sense, by a tensor with prespecified column rank, row rank, etc.
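The entry above concerns approximation by a tensor of prespecified multilinear rank. A standard, quasi-optimal way to compute such an approximation is the truncated higher-order SVD, which projects each mode onto its leading left singular vectors; the sketch below is our own illustration for order-3 tensors (the name `truncated_hosvd` and its interface are assumptions, not taken from the cited paper, whose algorithm refines this further).

```python
import numpy as np

def truncated_hosvd(T, ranks):
    """Truncated HOSVD sketch: compress an order-3 tensor T to multilinear
    rank (R1, R2, R3). Quasi-optimal, not the best approximation in general.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold T along `mode` and keep the leading r left singular vectors.
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: T contracted with the transposed factor matrices.
    core = np.einsum('ijk,ia,jb,kc->abc', T, *factors)
    # Reconstruction of the low multilinear rank approximation.
    approx = np.einsum('abc,ia,jb,kc->ijk', core, *factors)
    return approx, core, factors
```

When `T` has exact multilinear rank equal to `ranks`, the truncation is lossless and `approx` reproduces `T`; otherwise the error is within a factor of the best attainable for the given multilinear rank.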
Subtracting a best rank-1 approximation may increase tensor rank
- Computer Science, Mathematics, 2009 17th European Signal Processing Conference
- 2009
Monotonically convergent algorithms for symmetric tensor approximation
- Computer Science, Mathematics
- 2013
On the best low multilinear rank approximation of higher-order tensors
- Computer Science
- 2010
This paper deals with the best low multilinear rank approximation of higher-order tensors, used as a tool for dimensionality reduction and signal subspace estimation.
Canonical Polyadic Decomposition with a Columnwise Orthonormal Factor Matrix
- Mathematics, SIAM J. Matrix Anal. Appl.
- 2012
Orthogonality-constrained versions of the CPD methods based on simultaneous matrix diagonalization and alternating least squares are presented, and a simple proof is given of the existence of the optimal low-rank approximation of a tensor when one factor matrix is columnwise orthonormal.
Symmetric Tensors and Symmetric Tensor Rank
- Mathematics, Computer Science, SIAM J. Matrix Anal. Appl.
- 2008
The notion of the generic symmetric rank is discussed, which, due to the work of Alexander and Hirschowitz, is now known for any values of dimension and order.