Corpus ID: 218674300

Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent

@article{Tong2020AcceleratingIL,
  title={Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent},
  author={Tian Tong and Cong Ma and Yuejie Chi},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.08898}
}
Low-rank matrix estimation is a canonical problem that finds numerous applications in signal processing, machine learning and imaging science. A popular approach in practice is to factorize the matrix into two compact low-rank factors, and then optimize these factors directly via simple iterative methods such as gradient descent and alternating minimization. Despite nonconvexity, recent literature has shown that these simple heuristics in fact achieve linear convergence when initialized…
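The update rule at the heart of the paper preconditions each factor's gradient by the inverse Gram matrix of the other factor. Below is a minimal NumPy sketch of that update on the simplest fully observed loss f(L, R) = 0.5 * ||L R^T - M||_F^2; the function name, the toy objective, the step size, and the usage data are illustrative assumptions, while the paper itself covers matrix sensing, completion, and robust PCA with tailored spectral initializations.

```python
import numpy as np

def scaled_gd(M, r, eta=0.5, n_iters=200):
    """Minimal sketch of the ScaledGD update rule on the toy, fully
    observed loss f(L, R) = 0.5 * ||L R^T - M||_F^2. The function name,
    objective, and step size are illustrative assumptions."""
    # Spectral initialization: top-r SVD of the observed matrix.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])
    R = Vt[:r].T * np.sqrt(s[:r])
    for _ in range(n_iters):
        E = L @ R.T - M  # residual of the current factorization
        # Precondition each factor's gradient by the inverse Gram matrix
        # of the other factor; this is what removes the condition-number
        # dependence from the iteration complexity.
        L_new = L - eta * E @ R @ np.linalg.inv(R.T @ R)
        R_new = R - eta * E.T @ L @ np.linalg.inv(L.T @ L)
        L, R = L_new, R_new
    return L, R

# Toy usage: a 100 x 80 rank-3 matrix with condition number 100.
rng = np.random.default_rng(0)
U0 = np.linalg.qr(rng.standard_normal((100, 3)))[0]
V0 = np.linalg.qr(rng.standard_normal((80, 3)))[0]
M = U0 @ np.diag([100.0, 10.0, 1.0]) @ V0.T
L, R = scaled_gd(M, r=3)
print(np.linalg.norm(L @ R.T - M) / np.linalg.norm(M))  # should be tiny
```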
Citations

Beyond Procrustes: Balancing-free Gradient Descent for Asymmetric Low-Rank Matrix Sensing
This paper shows theoretically that, for the matrix sensing problem of recovering a low-rank matrix from a small number of linear measurements, gradient descent converges linearly without explicitly promoting balancedness of the factors, as long as the measurement ensemble satisfies the restricted isometry property.
Nonconvex Matrix Factorization From Rank-One Measurements
We consider the problem of recovering low-rank matrices from random rank-one measurements, which spans numerous applications including covariance sketching, phase retrieval, quantum state tomography…
A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a…
Manifold Gradient Descent Solves Multi-Channel Sparse Blind Deconvolution Provably and Efficiently
This paper proposes a novel approach based on nonconvex optimization over the sphere manifold by minimizing a smooth surrogate of the sparsity-promoting loss function, and demonstrates that manifold gradient descent with random initializations provably recovers the filter.
Lecture notes on non-convex algorithms for low-rank matrix recovery
Low-rank matrix recovery problems are inverse problems which naturally arise in various fields like signal processing, imaging and machine learning. They are non-convex and NP-hard in full…
Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements
This paper develops a scaled gradient descent (ScaledGD) algorithm to directly recover the tensor factors with tailored spectral initializations, and shows that it provably converges at a linear rate independent of the condition number of the ground truth tensor.
Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method…
Uncertainty quantification for nonconvex tensor completion: Confidence intervals, heteroscedasticity and optimality
The findings unveil the statistical optimality of nonconvex tensor completion: it attains unimprovable $\ell_{2}$ accuracy when estimating both the unknown tensor and the underlying tensor factors.
Learning Mixtures of Low-Rank Models
We study the problem of learning mixtures of low-rank models, i.e. reconstructing multiple low-rank matrices from unlabelled linear measurements of each. This problem enriches two widely studied…
Spectral Methods for Data Science: A Statistical Perspective
This monograph aims to present a systematic, comprehensive, yet accessible introduction to spectral methods from a modern statistical perspective, highlighting their algorithmic implications in diverse large-scale applications.

References

Showing 1-10 of 63 references.
Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees
This work provides a simple set of conditions under which projected gradient descent, when given a suitable initialization, converges geometrically to a statistically useful solution to the factorized optimization problem with rank constraints; a generic sketch of the projected-gradient template follows below.
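For intuition, here is a hedged sketch of the projected-gradient template in its singular-value-projection form for matrix completion: take a gradient step on the full matrix, then project back onto the rank-r set by truncated SVD. The cited paper instead analyzes projected gradient descent on the factorized parametrization with more general constraint sets, so this illustrates the broader template rather than their exact algorithm; the function name and step size are assumptions.

```python
import numpy as np

def svp_completion(M_obs, mask, r, eta=1.0, n_iters=100):
    """Sketch of projected gradient descent for low-rank estimation in
    singular-value-projection form. `mask` is a boolean array marking
    the observed entries of M_obs; name and step size are assumptions."""
    X = np.zeros_like(M_obs, dtype=float)
    for _ in range(n_iters):
        # Gradient of the completion loss 0.5 * ||P_Omega(X - M)||_F^2.
        grad = np.where(mask, X - M_obs, 0.0)
        # Project the gradient step back onto the rank-r set via SVD.
        U, s, Vt = np.linalg.svd(X - eta * grad, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r]
    return X
```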
Low rank matrix completion by alternating steepest descent methods
Matrix completion involves recovering a matrix from a subset of its entries by utilizing interdependency between the entries, typically through low rank structure. Despite matrix completion requiring…
Beyond Procrustes: Balancing-free Gradient Descent for Asymmetric Low-Rank Matrix Sensing
This paper shows theoretically that, for the matrix sensing problem of recovering a low-rank matrix from a small number of linear measurements, gradient descent converges linearly without explicitly promoting balancedness of the factors, as long as the measurement ensemble satisfies the restricted isometry property; a minimal sketch of such balancing-free gradient descent follows below.
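The claim here is that plain gradient descent on the unregularized factorized loss, with no balancing term such as ||L^T L - R^T R||_F^2, already converges. A minimal sketch on the toy fully observed loss f(L, R) = 0.5 * ||L R^T - M||_F^2 (the paper's actual setting is matrix sensing; names and step size are assumptions):

```python
import numpy as np

def balancing_free_gd(M, r, n_iters=500):
    """Plain factored gradient descent with spectral initialization and
    no balancing regularizer, on a toy fully observed loss."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])
    R = Vt[:r].T * np.sqrt(s[:r])
    eta = 0.5 / s[0]  # step size scaled by the spectral norm of M
    for _ in range(n_iters):
        E = L @ R.T - M  # residual of the current factorization
        L, R = L - eta * E @ R, R - eta * E.T @ L
    return L, R
```

Unlike the ScaledGD sketch above, the iteration count of this unscaled update degrades with the condition number of the ground truth, which is precisely the gap the main paper addresses.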
Nonconvex Matrix Factorization From Rank-One Measurements
We consider the problem of recovering low-rank matrices from random rank-one measurements, which spans numerous applications including covariance sketching, phase retrieval, quantum state tomography…
Low-Rank Matrix Recovery with Composite Optimization: Good Conditioning and Rapid Convergence
This framework subsumes such important computational tasks as phase retrieval, blind deconvolution, quadratic sensing, matrix completion, and robust PCA, and shows that nonsmooth penalty formulations do not suffer from the same type of ill-conditioning.
Harnessing Structures in Big Data via Guaranteed Low-Rank Matrix Estimation: Recent Theory and Fast Algorithms via Convex and Nonconvex Optimization
A unified overview of recent advances in low-rank matrix estimation from incomplete measurements is provided, with attention paid to rigorous characterization of the performance of these algorithms, and to problems where the low-rank matrix has additional structural properties that require new algorithmic designs and theoretical analysis.
Guaranteed Matrix Completion via Nonconvex Factorization
Ruoyu Sun and Z. Luo, 2015 IEEE 56th Annual Symposium on Foundations of Computer Science, 2015
This paper establishes a theoretical guarantee for the factorization-based formulation to correctly recover the underlying low-rank matrix, and is the first to provide exact recovery guarantees for many standard algorithms such as gradient descent, SGD, and block coordinate gradient descent.
Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
This tutorial-style overview highlights the important role of statistical models in enabling efficient nonconvex optimization with performance guarantees, and reviews two contrasting approaches: two-stage algorithms, which consist of a tailored initialization step followed by successive refinement; and global landscape analysis and initialization-free algorithms.
Low-rank matrix completion using alternating minimization
This paper presents one of the first theoretical analyses of the performance of alternating minimization for matrix completion, and the related problem of matrix sensing, and shows that alternating minimization guarantees faster convergence to the true matrix while allowing a significantly simpler analysis (a minimal sketch of the alternating least-squares steps follows below).
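Concretely, each alternating step is a batch of small least-squares solves: with R fixed, every row of L has a closed-form update from the observed entries in that row, and symmetrically for R. A hedged sketch follows; the ridge term and all names are implementation choices, not part of the cited analysis (which, for instance, uses fresh samples per iteration).

```python
import numpy as np

def als_completion(M, mask, r, n_iters=50, reg=1e-6):
    """Sketch of alternating minimization (ALS) for matrix completion.
    `mask` is a boolean array of observed entries; `reg` is a small
    ridge term for numerical stability (an implementation choice)."""
    m, n = M.shape
    # Spectral initialization from the zero-filled observed matrix.
    U, s, Vt = np.linalg.svd(np.where(mask, M, 0.0), full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])
    R = Vt[:r].T * np.sqrt(s[:r])
    ridge = reg * np.eye(r)
    for _ in range(n_iters):
        for i in range(m):  # closed-form least-squares update, row i of L
            A = R[mask[i]]
            L[i] = np.linalg.solve(A.T @ A + ridge, A.T @ M[i, mask[i]])
        for j in range(n):  # closed-form least-squares update, row j of R
            A = L[mask[:, j]]
            R[j] = np.linalg.solve(A.T @ A + ridge, A.T @ M[mask[:, j], j])
    return L, R
```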
Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
It is demonstrated that when the rank and the condition number of the unknown matrix are bounded by a constant, the convex programming approach achieves near-optimal estimation errors, in terms of the Euclidean loss, the entrywise loss, and the spectral norm loss, for a wide range of noise levels.