Corpus ID: 221090014

Statistical Query Lower Bounds for Tensor PCA

@article{Dudeja2020StatisticalQL,
  title={Statistical Query Lower Bounds for Tensor PCA},
  author={Rishabh Dudeja and Daniel J. Hsu},
  journal={arXiv: Statistics Theory},
  year={2020}
}
In the Tensor PCA problem introduced by Richard and Montanari (2014), one is given a dataset consisting of $n$ samples $\mathbf{T}_{1:n}$ of i.i.d. Gaussian tensors of order $k$ with the promise that $\mathbb{E}\mathbf{T}_1$ is a rank-1 tensor and $\|\mathbb{E} \mathbf{T}_1\| = 1$. The goal is to estimate $\mathbb{E} \mathbf{T}_1$. This problem exhibits a large conjectured hard phase when $k>2$: When $d \lesssim n \ll d^{\frac{k}{2}}$ it is information theoretically possible to estimate…
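The generative model in the abstract is straightforward to simulate. A minimal sketch in NumPy, assuming order $k=3$, a hypothetical planted unit vector `v` (so $\mathbb{E}\mathbf{T}_1 = v^{\otimes 3}$ has norm 1), and the plain average-then-tensor-power-iteration heuristic — an illustration of the model only, not the paper's statistical query framework:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 8, 2000  # dimension and sample count (illustrative sizes)

# Hypothetical planted spike: a unit vector v, so E[T_i] = v (x) v (x) v has norm 1.
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
spike = np.einsum('i,j,k->ijk', v, v, v)

# n i.i.d. order-3 samples: rank-1 mean plus standard Gaussian noise.
samples = spike + rng.standard_normal((n, d, d, d))
T_bar = samples.mean(axis=0)  # empirical mean tensor

# Tensor power iteration on the averaged tensor, a common estimation heuristic.
u = rng.standard_normal(d)
u /= np.linalg.norm(u)
for _ in range(50):
    u = np.einsum('ijk,j,k->i', T_bar, u, u)  # contract two modes with u
    u /= np.linalg.norm(u)

print(abs(u @ v))  # correlation with the planted direction
```

With these sizes $n \gg d^{3/2}$, so the simulation sits well inside the computationally easy regime; the conjectured hard phase discussed in the paper concerns $d \lesssim n \ll d^{k/2}$.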
Inference for Low-rank Tensors - No Need to Debias
TLDR: In the Tucker low-rank tensor PCA or regression model, provided with any estimates achieving some attainable error rate, data-driven confidence regions for the singular subspace of the parameter tensor are developed based on the asymptotic distribution of an updated estimate obtained by two-iteration alternating minimization.
Statistical Query Algorithms and Low-Degree Tests Are Almost Equivalent
TLDR: This paper studies two of the most popular restricted computational models, the statistical query framework and low-degree polynomials, in the context of high-dimensional hypothesis testing, and finds that under mild conditions on the testing problem, the two classes of algorithms are essentially equivalent in power.
Tensor Clustering with Planted Structures: Statistical Optimality and Computational Limits
TLDR: The sharp signal-to-noise-ratio boundaries for which CHC and ROHC detection/recovery are statistically possible are identified, and it is proved that polynomial-time algorithms cannot solve these problems under the computational hardness conjectures of hypergraphic planted clique detection and hypergraphic planted dense subgraph (HPDS) recovery.
Generalized Low-rank plus Sparse Tensor Estimation by Fast Riemannian Optimization
We investigate a generalized framework to estimate a latent low-rank plus sparse tensor, where the low-rank tensor often captures the multi-way principal components and the sparse tensor accounts for…
A New Framework for Tensor PCA Based on Trace Invariants
2020
We consider the Principal Component Analysis (PCA) problem for tensors $T \in (\mathbb{R}^n)^{\otimes k}$ of large dimension $n$ and of arbitrary order $k \geq 3$. It consists in recovering a spike $v_0^{\otimes k}$ (related to a signal…
On Support Recovery with Sparse CCA: Information Theoretic and Computational Limits
In this paper we consider asymptotically exact support recovery in the context of high dimensional and sparse Canonical Correlation Analysis (CCA). Our main results describe four regimes of interest…

References

Showing 1–10 of 36 references
The landscape of the spiked tensor model
We consider the problem of estimating a large rank-one tensor ${\boldsymbol u}^{\otimes k}\in({\mathbb R}^{n})^{\otimes k}$, $k\ge 3$, in Gaussian noise. Earlier work characterized a critical…
Sum-of-Squares Certificates for Maxima of Random Tensors on the Sphere
TLDR: The above bound is the best possible up to lower-order terms, namely the optimum of the level-$q$ SoS relaxation is at least $A_{\max} \cdot \bigl(\frac{n}{q^{\,1+o(1)}}\bigr)^{q/4-1/2}$.
On Mean Estimation for General Norms with Statistical Queries
TLDR: Sharp upper and lower bounds are obtained for the statistical query complexity of this problem when the underlying norm is symmetric, as well as for Schatten-$p$ norms, answering two questions raised by Feldman, Guzman, and Vempala (SODA 2017).
Tensor principal component analysis via sum-of-square proofs
TLDR: It is shown that degree-$4$ sum-of-squares relaxations break down for $\tau \leq O(n^{3/4}/\log(n)^{1/4})$, which demonstrates that improving the current guarantees would require new techniques or might even be intractable.
Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm
TLDR: A new norm called the subspace norm is proposed, which is based on the Kronecker products of factors obtained by the proposed simple estimator, and it is empirically demonstrated that the subspace norm achieves nearly ideal denoising performance even with $H=O(1)$.
Efficient Algorithms and Lower Bounds for Robust Linear Regression
TLDR: Any polynomial-time SQ learning algorithm for robust linear regression (in Huber's contamination model) with estimation complexity must incur an error of $\Omega(\sqrt{\epsilon}\,\sigma)$.
Dealing with Range Anxiety in Mean Estimation via Statistical Queries
TLDR: Algorithms for high dimensional mean estimation and stochastic convex optimization in these models are obtained that work in more general settings than previously known solutions.
A General Characterization of the Statistical Query Complexity
TLDR: This work demonstrates that the complexity of solving general problems over distributions using SQ algorithms can be captured by a relatively simple notion of statistical dimension that it introduces, and is also the first to precisely characterize the necessary tolerance of queries.
On the Limitation of Spectral Methods: From the Gaussian Hidden Clique Problem to Rank One Perturbations of Gaussian Tensors
TLDR: A lower bound on the critical signal-to-noise ratio below which a rank-one signal cannot be detected is established on the basis of a more general result on rank-one perturbations of Gaussian tensors.
A statistical model for tensor PCA
TLDR: It turns out that the Principal Component Analysis problem for large tensors of arbitrary order $k$ under a single-spike (or rank-one plus noise) model is solvable as soon as the signal-to-noise ratio $\beta$ becomes larger than $C\sqrt{k\log k}$ (and in particular $\beta$ can remain bounded as the problem dimensions increase).