# On the Nuclear Norm and the Singular Value Decomposition of Tensors

@article{Derksen2016OnTN, title={On the Nuclear Norm and the Singular Value Decomposition of Tensors}, author={Harm Derksen}, journal={Foundations of Computational Mathematics}, year={2016}, volume={16}, pages={779-811} }

Finding the rank of a tensor is a problem that has many applications. Unfortunately, it is often very difficult to determine the rank of a given tensor. Inspired by the heuristics of convex relaxation, we consider the nuclear norm instead of the rank of a tensor. We determine the nuclear norm of various tensors of interest. Along the way, we also carry out a systematic study of various measures of orthogonality in tensor product spaces and give a new generalization of the singular value decomposition…
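As a small illustration of the convex-relaxation idea in the familiar matrix case (a sketch of my own, not taken from the paper): the nuclear norm is directly computable from the singular values and serves as a tractable surrogate for the rank it relaxes.

```python
import numpy as np

# The nuclear norm (sum of singular values) is the convex envelope of
# the rank function on the unit spectral-norm ball, which is what makes
# it a tractable surrogate for rank in optimization problems.
A = np.diag([3.0, 2.0, 0.0])

nuclear = np.linalg.norm(A, ord='nuc')  # sum of singular values: 3 + 2 = 5
rank = np.linalg.matrix_rank(A)         # number of nonzero singular values: 2
print(nuclear, rank)
```

For matrices this relaxation is well understood; the paper's contribution concerns the much harder tensor case, where even evaluating the nuclear norm is nontrivial.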

## 43 Citations

### Nuclear norm of higher-order tensors

- Computer Science, Mathematics
- Math. Comput.
- 2018

An analogue of Banach's theorem for the tensor spectral norm and an analogue of Comon's conjecture for tensor rank are established: for a symmetric tensor, its symmetric nuclear norm always equals its nuclear norm.

### Symmetric Tensor Nuclear Norms

- Computer Science
- SIAM J. Appl. Algebra Geom.
- 2017

This paper discusses how to compute symmetric tensor nuclear norms, depending on the tensor order and the ground field, and proposes methods that can be extended to nonsymmetric tensors.

### Bounds on the Spectral Norm and the Nuclear Norm of a Tensor Based on Tensor Partitions

- Computer Science, Mathematics
- SIAM J. Matrix Anal. Appl.
- 2016

When a tensor is partitioned into its matrix slices, the resulting inequalities provide polynomial-time worst-case approximation bounds for computing the spectral norm and the nuclear norm of the tensor.

### Tensor Ranks and Norms

- Computer Science, Mathematics
- 2022

The value of the recently introduced G-stable rank is calculated, the nuclear norm is investigated, and several notions of stable rank on tensors, built from common norms on tensor products, are introduced together with their relationship to other notions of tensor rank.

### Rank Properties and Computational Methods for Orthogonal Tensor Decompositions

- Computer Science, Mathematics
- Journal of Scientific Computing
- 2022

This work presents several properties of orthogonal rank, which differs from tensor rank in many respects, and proposes an algorithm based on the augmented Lagrangian method that has a great advantage over existing methods for strongly orthogonal decompositions in terms of approximation error.

### Algebraic Methods for Tensor Data

- Computer Science, Mathematics
- SIAM J. Appl. Algebra Geom.
- 2021

Numerical experiments are presented whose results show that the performance of the alternating least squares algorithm for the low-rank approximation of tensors can be improved using tensor amplification.
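For context on the ALS baseline mentioned in that summary, here is a minimal sketch (my own illustration, not code from the paper) of alternating least squares for the best rank-1 approximation of a tensor: fix all factors but one, update that factor in closed form, and cycle.

```python
import numpy as np

def rank1_als(T, iters=50, seed=0):
    """Best rank-1 approximation lam * u0 x u1 x ... by alternating least squares."""
    rng = np.random.default_rng(seed)
    u = [rng.standard_normal(n) for n in T.shape]
    u = [v / np.linalg.norm(v) for v in u]
    lam = 0.0
    for _ in range(iters):
        for k in range(T.ndim):
            # contract T with every factor except the mode-k one;
            # the closed-form least-squares update for u[k] is the normalized result
            v = T
            for j in range(T.ndim - 1, -1, -1):
                if j != k:
                    v = np.tensordot(v, u[j], axes=(j, 0))
            lam = np.linalg.norm(v)
            u[k] = v / lam
    return lam, u

# usage: recover an exactly rank-1 tensor
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = np.array([1.0, 1.0]) / np.sqrt(2)
T = 2.0 * np.einsum('i,j,k->ijk', a, b, c)
lam, u = rank1_als(T)
```

Full rank-r CP-ALS repeats the same closed-form update per factor matrix; the amplification idea in the cited paper is a preprocessing step on top of such a loop.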

### On the tensor spectral p-norm and its dual norm via partitions

- Mathematics
- Comput. Optim. Appl.
- 2020

A generalization of the spectral norm and the nuclear norm of a tensor via arbitrary tensor partitions, a much richer concept than block tensors, is presented.

### Completely positive tensor recovery with minimal nuclear value

- Computer Science
- Comput. Optim. Appl.
- 2018

The CP-nuclear value of a completely positive (CP) tensor and its properties are introduced, and a semidefinite relaxation algorithm is proposed for solving the minimal CP-nuclear-value tensor recovery problem.

### On norm compression inequalities for partitioned block tensors

- Computer Science, Mathematics
- 2020

It is proved that, for the tensor spectral norm, the norm of the compressed tensor is an upper bound on the norm of the original tensor, and that this result extends to a general class of tensor spectral norms.

### Approximate Low-Rank Tensor Learning

- Computer Science
- 2014

This work establishes a formal optimization guarantee for a general low-rank tensor learning formulation by combining a simple approximation algorithm for the tensor spectral norm with the recent generalized conditional gradient.

## References

Showing 1-10 of 57 references

### A Multilinear Singular Value Decomposition

- Mathematics
- SIAM J. Matrix Anal. Appl.
- 2000

There is a strong analogy between several properties of the matrix and the higher-order tensor decomposition; uniqueness, link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed.
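A minimal sketch of the higher-order SVD construction from that paper (my own illustration, assuming the standard mode-k unfolding convention): compute each factor matrix from the SVD of the corresponding unfolding, then form the core by multiplying the transposed factors along every mode.

```python
import numpy as np

def unfold(T, mode):
    # mode-k unfolding: mode k becomes the row index of a matrix
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    # factor matrices: left singular vectors of each mode unfolding
    U = [np.linalg.svd(unfold(T, k), full_matrices=False)[0] for k in range(T.ndim)]
    # core tensor: T multiplied by U_k^T along every mode
    S = T
    for k, Uk in enumerate(U):
        S = np.moveaxis(np.tensordot(Uk.T, np.moveaxis(S, k, 0), axes=1), 0, k)
    return S, U

def reconstruct(S, U):
    # multiply the core by U_k along every mode to recover T
    T = S
    for k, Uk in enumerate(U):
        T = np.moveaxis(np.tensordot(Uk, np.moveaxis(T, k, 0), axes=1), 0, k)
    return T
```

Without truncation the reconstruction is exact and each factor matrix is column-orthonormal, mirroring the matrix SVD; truncating columns of the factor matrices gives the usual quasi-optimal low multilinear-rank approximation.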

### On the Ranks and Border Ranks of Symmetric Tensors

- Mathematics, Computer Science
- Found. Comput. Math.
- 2010

Improved lower bounds for the rank of a symmetric tensor are provided by considering the singularities of the hypersurface defined by the polynomial.

### Tensor completion and low-n-rank tensor recovery via convex optimization

- Computer Science
- 2011

This paper uses the n-rank of a tensor as a sparsity measure and considers the low-n-rank tensor recovery problem, i.e., the problem of finding the tensor of lowest n-rank that fulfills some linear constraints.

### Most Tensor Problems Are NP-Hard

- Computer Science, Mathematics
- JACM
- 2013

It is proved that multilinear (tensor) analogues of many efficiently computable problems in numerical linear algebra are NP-hard, and that computing the combinatorial hyperdeterminant is NP-, #P-, and VNP-hard.

### Powers of tensors and fast matrix multiplication

- Computer Science
- ISSAC
- 2014

A method is presented to analyze the powers of a given trilinear form (a special kind of algebraic construction also called a tensor) and thereby obtain improved upper bounds on ω, the exponent of the asymptotic complexity of matrix multiplication.

### Symmetric Tensors and Symmetric Tensor Rank

- Mathematics, Computer Science
- SIAM J. Matrix Anal. Appl.
- 2008

The notion of the generic symmetric rank is discussed, which, due to the work of Alexander and Hirschowitz, is now known for any values of dimension and order.

### Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem

- Mathematics, Computer Science
- SIAM J. Matrix Anal. Appl.
- 2008

It is argued that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations, and a natural way of overcoming the ill-posedness of the low-rank approximation problem is proposed by using weak solutions when true solutions do not exist.
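A small numerical illustration of this ill-posedness (my own sketch, using the standard "W tensor" example rather than code from the paper): a rank-3 tensor can be approximated arbitrarily well by rank-2 tensors, so no best rank-2 approximation exists.

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
outer3 = lambda x, y, z: np.einsum('i,j,k->ijk', x, y, z)

# The W tensor has rank 3 but border rank 2: it is a limit of rank-2 tensors.
W = outer3(a, a, b) + outer3(a, b, a) + outer3(b, a, a)

def rank2_approx(eps):
    # a rank-2 tensor (difference of two rank-1 terms) converging to W as eps -> 0
    return (outer3(a + eps * b, a + eps * b, a + eps * b) - outer3(a, a, a)) / eps

# the error shrinks with eps, so the infimum over rank-2 tensors is 0
# yet is never attained by any fixed rank-2 tensor
err = lambda eps: np.linalg.norm(rank2_approx(eps) - W)
```

The catch is that the two rank-1 terms of `rank2_approx(eps)` blow up individually as `eps -> 0`, which is exactly the degeneracy that motivates the paper's weak-solution formulation.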

### Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization

- Computer Science, Mathematics
- SIAM Rev.
- 2010

It is shown that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum-rank solution can be recovered by solving a convex optimization problem, namely, the minimization of the nuclear norm over the given affine space.
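The workhorse subroutine in most solvers for this kind of nuclear-norm minimization is the proximal operator of the nuclear norm, which amounts to soft-thresholding the singular values (a generic sketch, not an implementation from that paper):

```python
import numpy as np

def svt(A, tau):
    # proximal operator of tau * ||.||_nuclear (singular value thresholding):
    # shrink each singular value toward zero by tau, dropping those below it
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# e.g. svt(np.diag([3.0, 1.0]), 2.0) keeps only the dominant singular direction
B = svt(np.diag([3.0, 1.0]), 2.0)
```

Iterating this shrinkage inside a gradient or splitting scheme drives iterates toward low-rank solutions of the affine-constrained problem.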