Corpus ID: 204837870

Deterministic tensor completion with hypergraph expanders

@article{Harris2019DeterministicTC,
  title={Deterministic tensor completion with hypergraph expanders},
  author={K. D. Harris and Yizhe Zhu},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.10692}
}
We provide a novel analysis of low rank tensor completion based on hypergraph expanders. As a proxy for rank, we minimize the max-quasinorm of the tensor, introduced by Ghadermarzy, Plan, and Yilmaz (2018), which generalizes the max-norm for matrices. Our analysis is deterministic and shows that the number of samples required to recover an order-$t$ tensor with at most $n$ entries per dimension is linear in $n$, under the assumption that the rank and order of the tensor are $O(1)$. As steps in…
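For intuition on the matrix max-norm that the max-quasinorm generalizes: for a matrix $M$, $\|M\|_{\max}$ is the infimum of $\|U\|_{2,\infty}\|V\|_{2,\infty}$ over all factorizations $M = UV^\top$, where $\|\cdot\|_{2,\infty}$ is the largest row 2-norm. A minimal sketch, assuming NumPy; it evaluates the bound given by one particular factorization, not the infimum itself:

```python
import numpy as np

def row_norm_2inf(A):
    # ||A||_{2,inf}: the largest Euclidean norm among the rows of A.
    return np.max(np.linalg.norm(A, axis=1))

def max_norm_bound(U, V):
    # Upper bound on ||U V^T||_max from this particular factorization;
    # the max-norm is the infimum of this product over all factorizations.
    return row_norm_2inf(U) * row_norm_2inf(V)

# Toy example: a rank-2 factorization of a 5x6 matrix.
rng = np.random.default_rng(0)
U = rng.standard_normal((5, 2))
V = rng.standard_normal((6, 2))
bound = max_norm_bound(U, V)  # >= ||U V^T||_max
```

The tensor max-quasinorm replaces the two factors with $t$ factors, one per mode, which is why it serves as a rank proxy in the completion analysis above.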

Figures from this paper

Citations

Sparse random tensors: Concentration, regularization and applications
We prove a non-asymptotic concentration inequality for sparse inhomogeneous random tensors under the spectral norm. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max} \geq c \log n$ …

References

Showing 1–10 of 75 references
Near-optimal sample complexity for convex tensor completion
This paper proves that solving a constrained least-squares estimation using either the convex atomic norm or the nonconvex max-qnorm achieves optimal sample complexity for low-rank tensor completion, and shows that these bounds are nearly minimax rate-optimal.
Spectral algorithms for tensor completion
A new unfolding-based method is proposed, which outperforms naive ones for symmetric $k$-th order tensors of rank $r$; this result is complemented with a different spectral algorithm for third-order tensors in the overcomplete regime.
Provable Tensor Factorization with Missing Data
A novel alternating-minimization-based method which iteratively refines estimates of the singular vectors, and can recover a three-mode $n \times n \times n$ rank-$r$ tensor exactly from $O(n^{3/2} r^5 \log^4 n)$ randomly sampled entries.
Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery
The new tractable formulation for low-rank tensor recovery shows how the sample complexity can be reduced by designing convex regularizers that exploit several structures jointly.
Noisy Tensor Completion via the Sum-of-Squares Hierarchy
The main technical result characterizes the Rademacher complexity of the sequence of norms arising in the sum-of-squares relaxations of the tensor nuclear norm, establishing a new connection between noisy tensor completion and the task of refuting random constraint satisfaction problems.
Tensor completion and low-n-rank tensor recovery via convex optimization
In this paper we consider sparsity on a tensor level, as given by the n-rank of a tensor. In the important sparse-vector approximation problem (compressed sensing) and the low-rank matrix recovery …
Tensor Completion Algorithms in Big Data Analytics
A modern overview of recent advances in tensor completion algorithms from the perspective of big data analytics, characterized by diverse variety, large volume, and high velocity.
Most Tensor Problems Are NP-Hard
It is proved that multilinear (tensor) analogues of many efficiently computable problems in numerical linear algebra are NP-hard, and that computing the combinatorial hyperdeterminant is NP-, #P-, and VNP-hard.
Tensor completion for estimating missing values in visual data
An algorithm to estimate missing values in tensors of visual data, laying out the theoretical foundations and building a working algorithm that is more accurate and robust than heuristic approaches.
Universal Matrix Completion
This work shows that if the set of sampled indices comes from the edges of a bipartite graph with a large spectral gap, then nuclear-norm minimization exactly recovers all low-rank matrices that satisfy certain incoherence properties.
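The spectral-gap condition invoked in this line of work can be checked numerically: for a $d$-regular bipartite sampling graph with biadjacency matrix $A$, the top singular value of $A$ equals $d$, and the gap is $d$ minus the second-largest singular value. A minimal sketch, assuming NumPy; the circulant sampling pattern is a toy example of this writeup, not the construction used in any of the papers above:

```python
import numpy as np

# Toy d-regular bipartite graph on n left and n right vertices:
# left vertex i is joined to right vertices i, i+1, ..., i+d-1 (mod n),
# giving a circulant 0/1 biadjacency matrix with d ones per row and column.
n, d = 11, 3
A = np.zeros((n, n))
for i in range(n):
    for k in range(d):
        A[i, (i + k) % n] = 1.0

# Singular values in descending order; s[0] == d for a d-regular graph.
s = np.linalg.svd(A, compute_uv=False)
gap = d - s[1]  # larger gap => better expander => stronger recovery guarantee
```

For a deterministic sampling scheme, one would certify a large gap for the chosen graph once, offline; the recovery guarantee then holds for every incoherent low-rank matrix, with no randomness in the sample locations.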