ISLET: Fast and Optimal Low-rank Tensor Regression via Importance Sketching
@article{Zhang2020ISLETFA,
  title   = {ISLET: Fast and Optimal Low-rank Tensor Regression via Importance Sketching},
  author  = {Anru Zhang and Yuetian Luo and G. Raskutti and M. Yuan},
  journal = {SIAM J. Math. Data Sci.},
  year    = {2020},
  volume  = {2},
  pages   = {444--479}
}
In this paper, we develop a novel procedure for low-rank tensor regression, namely Importance Sketching Low-rank Estimation for Tensors (ISLET). The central idea behind ISLET is importance sketching, i.e., carefully designed sketches based on both the responses and the low-dimensional structure of the parameter of interest. We show that the proposed method is sharply minimax optimal in terms of the mean-squared error under…
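Since the abstract only outlines the idea, the following is a minimal, illustrative sketch of the importance-sketching recipe it describes, assuming an i.i.d. standard Gaussian design and a Tucker low-rank parameter tensor. The function names (`islet_sketch`, `mode_unfold`, `multilinear_product`) and the simplified core-only regression are illustrative assumptions, not the paper's full ISLET procedure, which also sketches the complementary "arm" components before reassembling the estimate.

```python
import numpy as np

def mode_unfold(T, mode):
    """Mode-k matricization of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def multilinear_product(T, mats):
    """Project T onto the column spaces of mats: T x_1 U1' x_2 U2' x_3 U3'."""
    for k, U in enumerate(mats):
        T = np.moveaxis(np.tensordot(U.T, T, axes=(1, k)), 0, k)
    return T

def islet_sketch(X, y, ranks):
    """Simplified importance-sketching estimator for y_i = <A, X_i> + eps_i.

    X: (n, p1, p2, p3) covariate tensors; y: (n,) responses;
    ranks: Tucker ranks (r1, r2, r3) of the target tensor A.
    Assumes i.i.d. standard Gaussian design and n >= r1 * r2 * r3.
    """
    n = X.shape[0]
    # Step 1: moment estimate of A (unbiased under standard Gaussian design).
    A_tilde = np.tensordot(y, X, axes=(0, 0)) / n
    # Step 2: estimate the low-rank subspaces by HOSVD of the moment estimate.
    U = [np.linalg.svd(mode_unfold(A_tilde, k))[0][:, :r]
         for k, r in enumerate(ranks)]
    # Step 3: importance sketching -- compress each covariate onto the
    # estimated subspaces, reducing dimension from p1*p2*p3 to r1*r2*r3.
    X_sketch = np.stack([multilinear_product(X[i], U).ravel() for i in range(n)])
    # Step 4: low-dimensional least squares for the Tucker core.
    core, *_ = np.linalg.lstsq(X_sketch, y, rcond=None)
    core = core.reshape(ranks)
    # Step 5: assemble the estimate A_hat = core x_1 U1 x_2 U2 x_3 U3.
    A_hat = core
    for k in range(3):
        A_hat = np.moveaxis(np.tensordot(U[k], A_hat, axes=(1, k)), 0, k)
    return A_hat
```

Because the regression in Step 4 is carried out in the sketched (r1 * r2 * r3)-dimensional space rather than the ambient p1 * p2 * p3 one, the dominant cost is forming the sketches; the paper's full procedure additionally regresses on sketches of the arm components and reassembles them with the core, which is what underlies the optimality claim in the abstract.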
10 Citations
- Recursive Importance Sketching for Rank Constrained Least Squares: Algorithms and High-order Convergence. ArXiv, 2020.
- Sparse and Low-Rank Tensor Estimation via Cubic Sketchings. IEEE Transactions on Information Theory, 2020.
- Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition. ArXiv, 2020.
- An Optimal Statistical and Computational Framework for Generalized Tensor Estimation. ArXiv, 2020.
- A Sharp Blockwise Tensor Perturbation Bound for Orthogonal Iteration. ArXiv, 2020.
- Tensor Clustering with Planted Structures: Statistical Optimality and Computational Limits. ArXiv, 2020.
- Spectral Methods for Data Science: A Statistical Perspective. ArXiv, 2020.