Corpus ID: 235352841

A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

@inproceedings{Kmmerle2021ASS,
  title={A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples},
  author={Christian K{\"u}mmerle and Claudio Mayrink Verdun},
  booktitle={International Conference on Machine Learning},
  year={2021}
}
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate. It combines the favorable data-efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. We establish the first local convergence guarantee from a minimal number of samples for… 
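The abstract sketches the algorithmic template without an update rule; the following is a minimal, dense-matrix illustration of the reweighting idea, closer in spirit to the simpler sIRLS-type schemes among the references than to the paper's scalable second-order method. The step size `s`, the smoothing schedule for `gamma`, and all function names are illustrative assumptions.

```python
# Minimal IRLS sketch for low-rank matrix completion (NOT the paper's
# MatrixIRLS): gradient steps on the weighted surrogate trace(W X X^T),
# projected back onto the data constraint, with annealed smoothing gamma.
import numpy as np

def irls_matrix_completion(M_obs, mask, rank_guess, n_iter=500, s=0.9):
    X = np.where(mask, M_obs, 0.0)                  # start from observed entries
    gamma = np.linalg.norm(X, 2) ** 2               # smoothing, annealed below
    for _ in range(n_iter):
        # Weight matrix W = (X X^T + gamma I)^(-1/2) (nuclear-norm surrogate).
        evals, U = np.linalg.eigh(X @ X.T)
        W = (U * (np.maximum(evals, 0.0) + gamma) ** (-0.5)) @ U.T
        # Gradient step on trace(W X X^T), then re-impose the observations.
        X = X - (s / np.linalg.norm(W, 2)) * (W @ X)
        X[mask] = M_obs[mask]
        # Tie the smoothing to the (rank_guess + 1)-th eigenvalue of X X^T.
        gamma = max(min(gamma, np.sort(evals)[::-1][rank_guess]), 1e-12)
    return X

# Tiny usage example: a random rank-2 matrix with ~50% observed entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(M.shape) < 0.5
X_hat = irls_matrix_completion(np.where(mask, M, 0.0), mask, rank_guess=2)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))  # relative error
```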

Papers Citing This Work

Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate

It is proved that a variant of IRLS converges with a global linear rate to a sparse solution, i.e., with a linear error decrease occurring immediately from any initialization, if the measurements fulfill the usual null space property assumption.
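As context for that result, here is a minimal IRLS sketch for basis pursuit (min ‖x‖₁ subject to Ax = b), assuming the classic weighted-least-squares formulation; the shrinking schedule for `eps` follows the common Daubechies-style heuristic and need not match the variant analyzed in the paper.

```python
# IRLS for basis pursuit: each iteration solves a weighted least-squares
# problem min sum_i x_i^2 / d_i subject to Ax = b in closed form.
import numpy as np

def irls_basis_pursuit(A, b, k, n_iter=50):
    m, n = A.shape
    x = A.T @ np.linalg.solve(A @ A.T, b)        # least-norm initialization
    eps = 1.0
    for _ in range(n_iter):
        d = np.sqrt(x ** 2 + eps ** 2)           # smoothed |x_i|
        AD = A * d                               # A @ diag(d)
        x = d * (A.T @ np.linalg.solve(AD @ A.T, b))
        eps = min(eps, np.sort(np.abs(x))[::-1][k] / n)  # k = target sparsity
    return x

# Usage: recover a 3-sparse vector from 40 random measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 58]] = [1.0, -2.0, 0.5]
print(np.linalg.norm(irls_basis_pursuit(A, A @ x_true, k=3) - x_true))
```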

GNMR: A provable one-line algorithm for low rank matrix recovery

GNMR is presented: an extremely simple iterative algorithm for low rank matrix recovery based on a Gauss-Newton linearization. For matrix completion with uniform sampling, GNMR is shown to perform better than several popular methods, especially when given very few observations close to the information limit.
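A hedged sketch of what one such Gauss-Newton step can look like, assuming the linearized least-squares formulation the abstract describes; this dense, loop-based version is for illustration and is not the authors' implementation.

```python
# GNMR-style iteration: each step solves a single linear least-squares
# problem from the Gauss-Newton linearization of P_Omega(U V^T) = P_Omega(M).
import numpy as np

def gnmr(M_obs, mask, r, n_iter=30):
    m, n = M_obs.shape
    rows, cols = np.nonzero(mask)
    # Spectral initialization from the zero-filled, rescaled observations.
    U0, s0, V0t = np.linalg.svd(np.where(mask, M_obs, 0.0) / mask.mean(),
                                full_matrices=False)
    U, V = U0[:, :r] * np.sqrt(s0[:r]), V0t[:r].T * np.sqrt(s0[:r])
    for _ in range(n_iter):
        # Linearization: U_new[i].V[j] + U[i].V_new[j] = M[i,j] + U[i].V[j]
        # for each observed entry (i, j); unknowns are U_new and V_new.
        A = np.zeros((rows.size, (m + n) * r))
        for t, (i, j) in enumerate(zip(rows, cols)):
            A[t, i * r:(i + 1) * r] = V[j]
            A[t, m * r + j * r:m * r + (j + 1) * r] = U[i]
        b = M_obs[rows, cols] + np.sum(U[rows] * V[cols], axis=1)
        z = np.linalg.lstsq(A, b, rcond=None)[0]   # minimum-norm solution
        U, V = z[:m * r].reshape(m, r), z[m * r:].reshape(n, r)
    return U @ V.T
```

The minimum-norm least-squares solve is what keeps the otherwise underdetermined linearization well-behaved, which is why the method can be stated in essentially one line.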

Accelerating SGD for Highly Ill-Conditioned Huge-Scale Online Matrix Completion

For a symmetric ground truth and the Root Mean Square Error (RMSE) loss, it is proved that the preconditioned SGD converges to ε-accuracy in O(log(1/ε)) iterations, with a rapid linear convergence rate as if the ground truth were perfectly conditioned with κ = 1.
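A minimal sketch of that preconditioning idea for a symmetric ground truth M ≈ XXᵀ: each stochastic gradient step is right-multiplied by (XᵀX)⁻¹. The step size, the uniform sampling of observed entries, and the periodic refresh of the preconditioner are illustrative assumptions.

```python
# Preconditioned SGD for symmetric matrix completion: scale each stochastic
# gradient by (X^T X)^{-1} so progress is insensitive to the conditioning.
import numpy as np

def scaled_sgd(M, mask, r, n_steps=100_000, lr=0.1):
    rng = np.random.default_rng(0)
    X = 0.1 * rng.standard_normal((M.shape[0], r))
    obs = np.argwhere(mask)
    P = np.linalg.inv(X.T @ X + 1e-8 * np.eye(r))    # preconditioner
    for t in range(n_steps):
        i, j = obs[rng.integers(obs.shape[0])]       # sample one observed entry
        resid = X[i] @ X[j] - M[i, j]
        gi, gj = resid * X[j], resid * X[i]          # grads of 0.5 * resid^2
        X[i] = X[i] - lr * gi @ P                    # preconditioned updates
        if i != j:
            X[j] = X[j] - lr * gj @ P
        if t % 1000 == 0:                            # occasionally refresh P
            P = np.linalg.inv(X.T @ X + 1e-8 * np.eye(r))
    return X @ X.T
```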

Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

We advance both the theory and practice of robust ℓp-quasinorm regression for p ∈ (0, 1] by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying…
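A minimal IRLS sketch for that regression setting, assuming the standard residual-based reweighting with weights floored at a small `delta`; this is a common textbook variant, not necessarily the novel variants with the guarantees above.

```python
# IRLS for lp regression, p in (0, 1]: alternate between computing residual
# weights |r_i|^(p-2) and solving the resulting weighted least squares.
import numpy as np

def irls_lp_regression(A, y, p=1.0, n_iter=50, delta=1e-8):
    beta = np.linalg.lstsq(A, y, rcond=None)[0]      # ordinary LS start
    for _ in range(n_iter):
        r = y - A @ beta
        w = np.maximum(np.abs(r), delta) ** (p - 2.0)
        Aw = A * w[:, None]                          # diag(w) @ A
        beta = np.linalg.solve(A.T @ Aw, Aw.T @ y)   # normal equations
    return beta

# Usage: l1 regression shrugs off gross outliers in 10% of the responses.
rng = np.random.default_rng(2)
A = rng.standard_normal((200, 5))
beta_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = A @ beta_true
y[:20] += 50 * rng.standard_normal(20)
print(irls_lp_regression(A, y, p=1.0))               # close to beta_true
```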

Lp Quasi-norm Minimization: Algorithm and Applications

A heuristic method for retrieving sparse approximate solutions of optimization problems by minimizing the ℓp quasi-norm, using a proximal gradient step in place of the convex projection step to speed up the algorithm, while proving its convergence.
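A hedged sketch of such a proximal gradient scheme for min_x ½‖Ax − b‖² + τ‖x‖ₚᵖ. Closed-form proximal maps exist only for special p (e.g. 1/2 or 2/3), so the scalar prox below is solved by a short monotone fixed-point loop; the step size 1/L and all parameter values are illustrative assumptions.

```python
# Proximal gradient for the lp quasi-norm: forward gradient step on the
# smooth part, then a separable (per-coordinate) prox of tau * |t|^p.
import numpy as np

def prox_lp_scalar(z, tau, p, iters=50):
    # min_t 0.5 * (t - z)^2 + tau * |t|^p: find the nonzero stationary point
    # t = |z| - tau * p * t^(p-1) by fixed-point iteration, compare with t = 0.
    a = abs(z)
    if a == 0.0:
        return 0.0
    t = a
    for _ in range(iters):
        t_new = a - tau * p * t ** (p - 1.0)
        if t_new <= 0.0:
            return 0.0                               # no nonzero minimizer
        t = t_new
    if 0.5 * (t - a) ** 2 + tau * t ** p < 0.5 * a ** 2:
        return np.sign(z) * t
    return 0.0

def prox_grad_lp(A, b, tau=0.05, p=0.5, n_iter=300):
    L = np.linalg.norm(A, 2) ** 2                    # gradient Lipschitz bound
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - b) / L                # forward (gradient) step
        x = np.array([prox_lp_scalar(zi, tau / L, p) for zi in z])
    return x
```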

Learning Transition Operators From Sparse Space-Time Samples

A suitable non-convex iteratively reweighted least squares (IRLS) algorithm is developed, its local quadratic convergence is established, and it is shown that spatial samples can be substituted by a comparable number of space-time samples.

PI-NLF: A Proportional-Integral Approach for Non-negative Latent Factor Analysis

This work demonstrates the feasibility of boosting the performance of a non-negative learning algorithm through an error feedback controller: the resulting PI-NLF model outperforms state-of-the-art models in both computational efficiency and estimation accuracy for the missing data of a high-dimensional and incomplete (HDI) matrix.

Missing Value Imputation With Low-Rank Matrix Completion in Single-Cell RNA-Seq Data by Considering Cell Heterogeneity

A novel method is developed, called single cell Gauss–Newton Gene expression Imputation (scGNGI), to impute the scRNA-seq expression matrices by using a low-rank matrix completion, which can better preserve gene expression variability among cells.

References

Showing 1-10 of 97 references.

Low rank matrix completion by alternating steepest descent methods

Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent

This paper theoretically shows that ScaledGD achieves the best of both worlds: it converges linearly at a rate independent of the condition number of the low-rank matrix, similar to alternating minimization, while maintaining the low per-iteration cost of gradient descent.
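A minimal sketch of the ScaledGD update for matrix completion with factors L (m×r) and R (n×r): plain gradient steps on the observed residual, each right-multiplied by (RᵀR)⁻¹ or (LᵀL)⁻¹. The spectral initialization and step size below are simplified, illustrative choices.

```python
# ScaledGD-style iteration: the (R^T R)^{-1} and (L^T L)^{-1} factors rescale
# the gradients so the convergence rate does not degrade with conditioning.
import numpy as np

def scaled_gd(M_obs, mask, r, n_iter=300, eta=0.5):
    p = mask.mean()                                  # observation probability
    U, s, Vt = np.linalg.svd(np.where(mask, M_obs, 0.0) / p, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])                    # spectral initialization
    R = Vt[:r].T * np.sqrt(s[:r])
    for _ in range(n_iter):
        E = np.where(mask, L @ R.T - M_obs, 0.0) / p # residual on observed set
        L_new = L - eta * (E @ R) @ np.linalg.inv(R.T @ R)
        R = R - eta * (E.T @ L) @ np.linalg.inv(L.T @ L)
        L = L_new                                    # simultaneous update
    return L @ R.T
```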

Rank 2r Iterative Least Squares: Efficient Recovery of Ill-Conditioned Low Rank Matrices from Few Entries

We present a new, simple and computationally efficient iterative method for low rank matrix completion. Our method is inspired by the class of factorization-type iterative algorithms, but…

Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization

An efficient implementation of an iteratively reweighted least squares algorithm is presented for recovering a matrix from a small number of linear measurements, designed to simultaneously promote both a minimal nuclear norm and an approximately low-rank solution.

Guaranteed Matrix Completion via Nonconvex Factorization

  • Ruoyu Sun, Z. Luo
  • Computer Science
  • 2015 IEEE 56th Annual Symposium on Foundations of Computer Science
  • 2015
This paper establishes a theoretical guarantee for the factorization-based formulation to correctly recover the underlying low-rank matrix, and is the first to provide an exact recovery guarantee for many standard algorithms such as gradient descent, SGD, and block coordinate gradient descent.

Harmonic Mean Iteratively Reweighted Least Squares for low-rank matrix recovery

HM-IRLS optimizes a non-convex Schatten-p penalization to promote low-rankness; the strategy carries three major strengths, in particular for the matrix completion setting.

Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm

A low-rank factorization model is proposed, and a nonlinear successive over-relaxation (SOR) algorithm is constructed that requires solving only a linear least squares problem per iteration, improving the capacity to solve large-scale problems.
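A hedged sketch of the alternating least-squares core of such a factorization scheme: alternately refit the factors against a working matrix Z, then re-impose the observed data. The over-relaxation weight `omega` (with `omega = 1.0` giving plain alternation) stands in for the paper's adaptively tuned SOR weight and is an illustrative simplification.

```python
# LMaFit-style nonlinear SOR sketch: each sweep solves two linear
# least-squares problems and over-relaxes the data-fitting correction.
import numpy as np

def sor_factorization(M_obs, mask, r, n_iter=200, omega=1.0):
    rng = np.random.default_rng(0)
    X = rng.standard_normal((M_obs.shape[0], r))
    Z = np.where(mask, M_obs, 0.0)                    # working matrix
    for _ in range(n_iter):
        Y = np.linalg.lstsq(X, Z, rcond=None)[0]      # fit Y given X
        X = np.linalg.lstsq(Y.T, Z.T, rcond=None)[0].T  # fit X given Y
        P = X @ Y                                     # current low-rank model
        Z = P + omega * np.where(mask, M_obs - P, 0.0)  # re-impose the data
    return X @ Y
```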

Iterative reweighted algorithms for matrix rank minimization

This paper proposes a family of Iterative Reweighted Least Squares algorithms, IRLS-p, and gives theoretical guarantees similar to those for nuclear norm minimization: recovery of low-rank matrices under certain assumptions on the operator defining the constraints.

Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview

This tutorial-style overview highlights the important role of statistical models in enabling efficient nonconvex optimization with performance guarantees and reviews two contrasting approaches: two-stage algorithms, which consist of a tailored initialization step followed by successive refinement; and global landscape analysis and initialization-free algorithms.

Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed ℓq Minimization

This paper starts with a preliminary yet novel analysis for unconstrained ℓq minimization, which includes convergence, error bound, and local convergence behavior, and extends the algorithm and analysis to the recovery of low-rank matrices.
...