Globally convergent Jacobi-type algorithms for simultaneous orthogonal symmetric tensor diagonalization

@article{Li2018GloballyCJ,
  title={Globally convergent Jacobi-type algorithms for simultaneous orthogonal symmetric tensor diagonalization},
  author={Jianze Li and Konstantin Usevich and Pierre Comon},
  journal={SIAM J. Matrix Anal. Appl.},
  year={2018},
  volume={39},
  pages={1-22}
}
In this paper, we consider a family of Jacobi-type algorithms for a simultaneous orthogonal diagonalization problem of symmetric tensors. For the Jacobi-based algorithm of [M. Ishteva, P.-A. Absil, and P. Van Dooren, SIAM J. Matrix Anal. Appl., 34 (2013), pp. 651--672], we prove its global convergence for simultaneous orthogonal diagonalization of symmetric matrices and 3rd-order tensors. We also propose a new Jacobi-based algorithm in the general setting and prove its global convergence for…
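The matrix case referenced above (simultaneous orthogonal diagonalization of symmetric matrices) can be illustrated with classical Jacobi rotations: each Givens angle is chosen in closed form to maximize the sum of squared diagonal entries over all matrices at once. The Python sketch below shows one such sweep under that assumption; the function name jacobi_joint_diag and its parameters are illustrative, and this is the standard Cardoso--Souloumiac-style matrix rotation, not the authors' tensor algorithm.

import numpy as np

def jacobi_joint_diag(matrices, sweeps=30, tol=1e-12):
    # Illustrative helper (not from the paper): jointly diagonalize a list of
    # real symmetric matrices by one orthogonal matrix Q, sweeping over index
    # pairs and picking each Givens angle to maximize the sum of squared
    # diagonal entries.
    A = [np.array(M, dtype=float) for M in matrices]
    n = A[0].shape[0]
    Q = np.eye(n)
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                # For each matrix, h_k = [a_pp - a_qq, 2*a_pq]; the rotated diagonal
                # gap equals h_k . (cos 2t, sin 2t), so the best (cos 2t, sin 2t) is
                # the dominant eigenvector of G = sum_k h_k h_k^T.
                H = np.array([[M[p, p] - M[q, q], 2.0 * M[p, q]] for M in A])
                G = H.T @ H
                _, V = np.linalg.eigh(G)
                v = V[:, -1]
                if v[0] < 0.0:  # choose the inner rotation, |theta| <= pi/4
                    v = -v
                theta = 0.5 * np.arctan2(v[1], v[0])
                c, s = np.cos(theta), np.sin(theta)
                R = np.eye(n)
                R[p, p] = R[q, q] = c
                R[p, q], R[q, p] = -s, s
                A = [R.T @ M @ R for M in A]
                Q = Q @ R
        # Stop once the total off-diagonal mass is negligible.
        off = sum(np.sum(M ** 2) - np.sum(np.diag(M) ** 2) for M in A)
        if off < tol:
            break
    return Q, A

For inputs that are exactly jointly diagonalizable (A_k = Q D_k Q^T), the sweep drives the off-diagonal mass to zero; the paper's results concern the analogous rotation updates for symmetric tensors and their global convergence.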
Citations

Convergence of a Jacobi-type method for the approximate orthogonal tensor diagonalization
An alternating least squares Jacobi-type algorithm that maximizes the squares of the diagonal entries of A; it works on 2×2×2 subtensors and can be generalized to higher-order tensors.
Approximate Matrix and Tensor Diagonalization by Unitary Transformations: Convergence of Jacobi-Type Algorithms
We propose a gradient-based Jacobi algorithm for a class of maximization problems on the unitary group, with a focus on approximate diagonalization of complex matrices and tensors by unitary transformations.
Jacobi-type algorithm for low rank orthogonal approximation of symmetric tensors and its convergence analysis
In this paper, we propose a Jacobi-type algorithm to solve the low rank orthogonal approximation problem of symmetric tensors. This algorithm includes as a special case the well-known Jacobi CoM2 algorithm.
Jacobi-type algorithms for homogeneous polynomial optimization on Stiefel manifolds with applications to tensor approximations
This paper studies gradient-based Jacobi-type algorithms to maximize two classes of homogeneous polynomials with orthogonality constraints, establishes their convergence properties, proposes the Jacobi-GP and Jacobi-MGP algorithms, and establishes their global convergence without any further condition.
On approximate diagonalization of third order symmetric tensors by orthogonal transformations
In this paper, we study the approximate orthogonal diagonalization problem of third order symmetric tensors. We define several classes of approximately diagonal tensors, including the ones…
Linear Convergence of an Alternating Polar Decomposition Method for Low Rank Orthogonal Tensor Approximations
Low rank orthogonal tensor approximation (LROTA) is an important problem in tensor computations and their applications. A classical and widely used algorithm is the alternating polar decomposition method.
The Epsilon-Alternating Least Squares for Orthogonal Low-Rank Tensor Approximation and Its Global Convergence
  • Yuning Yang
  • Computer Science, Mathematics
  • SIAM J. Matrix Anal. Appl.
  • 2020
The epsilon alternating least squares ($\epsilon$-ALS) is developed and analyzed for canonical polyadic decomposition (approximation) of a higher-order tensor where one or more of the factor matrices…
Gradient based block coordinate descent algorithms for joint approximate diagonalization of matrices
A gradient-based block coordinate descent (BCD-G) framework is proposed to solve the joint approximate diagonalization of matrices defined on the product of the complex Stiefel manifold and the special linear group; the algorithms specialize to gradient-based Jacobi-type algorithms.
Half-Quadratic Alternating Direction Method of Multipliers for Robust Orthogonal Tensor Approximation
Higher-order tensor canonical polyadic decomposition (CPD) with one or more of the latent factor matrices being columnwisely orthonormal has been well studied in recent years. However, most existing…
On Approximation Algorithm for Orthogonal Low-Rank Tensor Approximation
The goal of this work is to fill a gap in [Yang, SIAM J. Matrix Anal. Appl., 41 (2020), pp. 1797--1825]. In that work, an approximation procedure was proposed for orthogonal low-rank tensor approximation…

References

Showing 1-10 of 39 references
Jacobi Algorithm for the Best Low Multilinear Rank Approximation of Symmetric Tensors
An algorithm based on Jacobi rotations, for which symmetry is preserved at each iteration, is proposed for the best low multilinear rank approximation of third-order symmetric tensors.
Structure-Preserving Low Multilinear Rank Approximation of Antisymmetric Tensors
This paper is concerned with low multilinear rank approximations to antisymmetric tensors, that is, multivariate arrays for which the entries change sign when permuting pairs of indices. We show…
Proximal alternating linearized minimization for nonconvex and nonsmooth problems
A self-contained convergence analysis framework is derived, and it is established that each bounded sequence generated by PALM globally converges to a critical point.
Symmetric Tensors and Symmetric Tensor Rank
The notion of the generic symmetric rank is discussed, which, due to the work of Alexander and Hirschowitz, is now known for any values of dimension and order.
Independent component analysis and (simultaneous) third-order tensor diagonalization
It is shown that simultaneous optimal diagonalization of "third-order tensor slices" of the fourth-order cumulant is a suitable strategy, similar in spirit to the efficient JADE algorithm.
On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
A multilinear generalization of the best rank-R approximation problem for matrices, namely, the approximation of a given higher-order tensor, in an optimal least-squares sense, by a tensor that has prespecified column rank, row rank, etc.
On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors
  • Liqi Wang, M. Chu
  • Mathematics, Computer Science
  • SIAM J. Matrix Anal. Appl.
  • 2014
This paper partially addresses the missing piece by showing that, for almost all tensors, the iterates generated by the alternating least squares method for the rank-one approximation converge globally (a minimal sketch of the ALS rank-one iteration follows below).
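For reference, the alternating least squares iteration analyzed in this entry updates each factor of the rank-one model by contracting the tensor with the other two unit-norm factors and renormalizing. The Python sketch below uses an illustrative helper als_rank1 with hypothetical defaults; it is a generic ALS rank-one routine, not the authors' code.

import numpy as np

def als_rank1(T, iters=500, tol=1e-10, seed=0):
    # Illustrative sketch (not from the paper): ALS for a rank-one approximation
    # lam * (x o y o z) of a third-order tensor T.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(T.shape[0]); x /= np.linalg.norm(x)
    y = rng.standard_normal(T.shape[1]); y /= np.linalg.norm(y)
    z = rng.standard_normal(T.shape[2]); z /= np.linalg.norm(z)
    lam_old = 0.0
    for _ in range(iters):
        # Update each factor by contracting T with the other two and renormalizing.
        x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x, y)
        lam = np.linalg.norm(z)
        z /= lam
        # lam equals the multilinear form T(x, y, z) and never decreases across sweeps.
        if abs(lam - lam_old) < tol:
            break
        lam_old = lam
    return lam, x, y, z

The cited result states that, for almost all (generic) tensors, the sequence of iterates produced by this kind of update converges globally to a single limit point rather than merely having convergent objective values.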
A Jacobi-Type Method for Computing Orthogonal Tensor Decompositions
An extension of the Jacobi SVD algorithm for matrices is proposed to "condense" a tensor into fewer nonzero entries using orthogonal transformations.
Orthogonal Tensor Decompositions
  • T. Kolda
  • Computer Science, Mathematics
  • SIAM J. Matrix Anal. Appl.
  • 2001
The orthogonal decomposition of tensors (also known as multidimensional arrays or n-way arrays) is explored using two different definitions of orthogonality, together with a counterexample to a tensor extension of the Eckart--Young SVD approximation theorem.
A new convergence proof for the higher-order power method and generalizations
A proof for the point-wise convergence of the factors in the higher-order power method for tensors towards a critical point is given. It is obtained by applying established results from the theory of…