Corpus ID: 235458139

Square Root Principal Component Pursuit: Tuning-Free Noisy Robust Matrix Recovery

@article{Zhang2021SquareRP,
  title={Square Root Principal Component Pursuit: Tuning-Free Noisy Robust Matrix Recovery},
  author={Junhui Zhang and Jingkai Yan and John Wright},
  journal={ArXiv},
  year={2021},
  volume={abs/2106.09211}
}
We propose a new framework – Square Root Principal Component Pursuit – for low-rank matrix recovery from observations corrupted with noise and outliers. Inspired by the square root Lasso, this new formulation does not require prior knowledge of the noise level. We show that a single, universal choice of the regularization parameter suffices to achieve reconstruction error proportional to the (a priori unknown) noise level. In comparison, previous formulations such as stable PCP rely on noise…
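To make the comparison concrete, here is a minimal sketch of the kind of program the abstract describes, in notation assumed for illustration (the paper's exact statement may differ): with observation $Y = L_0 + S_0 + Z_0$, where $L_0$ is low-rank, $S_0$ is sparse, and $Z_0$ is dense noise, Square Root PCP solves

$$\min_{L,\,S}\;\; \|L\|_{*} \;+\; \lambda\,\|S\|_{1} \;+\; \mu\,\|Y - L - S\|_{F}.$$

The data-fit term uses the unsquared Frobenius norm, mirroring the square-root Lasso; this is what allows a single, noise-independent choice of $(\lambda, \mu)$ to yield error proportional to the unknown noise level.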

References

Showing 1–10 of 32 references
Stable Principal Component Pursuit
This result shows that the proposed convex program recovers the low-rank matrix even when a positive fraction of its entries are arbitrarily corrupted, with an error bound proportional to the noise level. It is the first result showing that classical Principal Component Analysis, optimal for small i.i.d. noise, can be made robust to gross sparse errors.
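For contrast with the square-root formulation sketched above, stable PCP is commonly stated (notation assumed here for illustration) as

$$\min_{L,\,S}\;\; \|L\|_{*} \;+\; \lambda\,\|S\|_{1} \quad \text{subject to} \quad \|Y - L - S\|_{F} \le \delta,$$

where the tolerance $\delta$ must be set according to the noise level; an equivalent Lagrangian form penalizes the squared Frobenius residual instead. This noise-dependent tuning is exactly what the square-root variant is designed to avoid.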
Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming
We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than n, but only s regressors are significant.
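The pivotal ("tuning-free") property comes from using an unsquared residual norm in the objective; in one standard scaling (stated here for context, and scalings vary across papers) the square-root Lasso solves

$$\min_{\beta}\;\; \frac{1}{\sqrt{n}}\,\|y - X\beta\|_{2} \;+\; \lambda\,\|\beta\|_{1},$$

so that a valid choice of $\lambda$ does not depend on the noise standard deviation, unlike the ordinary Lasso, whose penalty must scale with it.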
Bridging Convex and Nonconvex Optimization in Robust PCA: Noise, Outliers, and Missing Data
This paper delivers improved theoretical guarantees for the convex programming approach in low-rank matrix estimation, in the presence of (1) random noise, (2) gross sparse outliers, and (3) missing data.
Non-convex Robust PCA
A new provable method for robust PCA, where the task is to recover a low-rank matrix corrupted with sparse perturbations; this represents one of the few instances of global convergence guarantees for non-convex methods.
Low-Rank and Sparse Structure Pursuit via Alternating Minimization
A notion of bounded difference of gradients is defined, based on which it is rigorously proved that, with suitable initialization, the proposed nonconvex optimization algorithm enjoys linear convergence to the global optimum and exactly recovers the underlying low-rank and sparse matrices under standard incoherence and sparsity conditions.
Robust Matrix Decomposition with Outliers
This work studies conditions under which recovering a given observation matrix as the sum of a low-rank matrix and a sparse matrix is possible via a combination of $\ell_1$-norm and trace-norm minimization, and obtains stronger recovery guarantees than previous studies.
Low-Rank Matrix Recovery From Errors and Erasures
Provides a new unified performance guarantee on when minimizing the nuclear norm plus the $\ell_1$ norm succeeds in exact recovery, yielding the first guarantees for (1) recovery when only a vanishing fraction of entries of a corrupted matrix is observed, and (2) deterministic matrix completion.
Robust Recovery via Implicit Bias of Discrepant Learning Rates for Double Over-parameterization
With a double over-parameterization of both the low-rank matrix and the sparse corruption, gradient descent with discrepant learning rates provably recovers the underlying matrix even without prior knowledge of either the rank of the matrix or the sparsity of the corruption.
Robust matrix completion
This paper considers the problem of estimation of a low-rank matrix when most of its entries are not observed and some of the observed entries are corrupted. The observations are noisy realizations…
Robust video denoising using low rank matrix completion
The robustness and effectiveness of the proposed denoising algorithm in removing mixed noise, e.g. heavy Gaussian noise mixed with impulsive noise, is validated in experiments, and the proposed approach compares favorably against existing video denoising algorithms.