# Robust PCA With Partial Subspace Knowledge

@article{Zhan2015RobustPW,
title={Robust PCA With Partial Subspace Knowledge},
author={Jinchun Zhan and Namrata Vaswani},
journal={IEEE Transactions on Signal Processing},
year={2015},
volume={63},
pages={3332-3347}
}
• Published 6 March 2014
In recent work, robust Principal Components Analysis (PCA) has been posed as the problem of recovering a low-rank matrix L and a sparse matrix S from their sum, M := L + S, and a provably exact convex optimization solution called PCP has been proposed. This work studies the following problem: suppose we have partial knowledge of the column space of the low-rank matrix L. Can we use this information to improve the PCP solution, i.e., to allow recovery under weaker assumptions? We propose here…
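For concreteness, the PCP program mentioned in the abstract solves min ||L||_* + λ||S||_1 subject to L + S = M. The sketch below implements standard PCP only (not the modified-PCP this paper proposes to exploit partial subspace knowledge) via a common inexact augmented Lagrangian iteration; the defaults for λ and μ are conventional choices, not values taken from this page:

```python
import numpy as np

def shrink(X, tau):
    """Entrywise soft thresholding: shrink each entry toward zero by tau."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svd_shrink(X, tau):
    """Singular value thresholding: soft-threshold the singular values of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * shrink(s, tau)) @ Vt

def pcp(M, lam=None, mu=None, tol=1e-7, max_iter=1000):
    """Principal Component Pursuit via an inexact augmented Lagrangian method:
    solve min ||L||_* + lam * ||S||_1  subject to  L + S = M."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # standard weighting
    if mu is None:
        mu = 0.25 * m * n / np.abs(M).sum()     # common heuristic step size
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                        # dual variable
    for _ in range(max_iter):
        L = svd_shrink(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        R = M - L - S                           # primal residual
        Y += mu * R
        if np.linalg.norm(R) <= tol * np.linalg.norm(M):
            break
    return L, S
```

The weight λ = 1/√max(m, n) is the standard parameter-free choice under which exact recovery is proved for incoherent L and sufficiently sparse S.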
42 Citations

## Citations

### Sparse and low rank signal recovery with partial knowledge

A novel “online” RPCA algorithm based on the recently introduced Recursive Projected Compressive Sensing framework is developed and studied, and a correctness result is derived showing that modified-PCP indeed requires significantly weaker incoherence assumptions than PCP when the available subspace knowledge is accurate.

### MEDRoP: Memory-Efficient Dynamic Robust PCA

• Computer Science
ArXiv
• 2017
Memory-Efficient Dynamic Robust PCA provably solves dynamic RPCA under weakened versions of standard RPCA assumptions, a mild assumption on slow subspace change, and two simple assumptions (a lower bound on most outlier magnitudes and mutual independence of the true data vectors).

### New Results for Provable Dynamic Robust PCA

• Computer Science
ArXiv
• 2017
This work provides the first guarantee for dynamic RPCA that holds under (weakened) standard RPCA assumptions and a realistic model of slow subspace change, and analyzes an existing method called ReProCS.

### Static and Dynamic Robust PCA via Low-Rank + Sparse Matrix Decomposition: A Review

• Computer Science
ArXiv
• 2018
This article provides an exhaustive review of the last decade of literature on RPCA and its dynamic counterpart (robust subspace tracking), along with describing their theoretical guarantees, discussing the pros and cons of various approaches, and providing empirical comparisons of performance and speed.

### Static and Dynamic Robust PCA and Matrix Completion: A Review

• Computer Science
Proceedings of the IEEE
• 2018
This paper provides an exhaustive review of the last decade of literature on RPCA and its dynamic counterpart (RST), along with describing their theoretical guarantees, discussing the pros and cons of various approaches, and providing empirical comparisons of performance and speed.

### Exact Decomposition of Joint Low Rankness and Local Smoothness Plus Sparse Matrices

• Computer Science
IEEE transactions on pattern analysis and machine intelligence
• 2022
It is proved that, under mild assumptions, the proposed 3DCTV-RPCA model can recover both components exactly, which is claimed to be the first such theoretical guarantee among related methods combining low rankness and local smoothness.

### Nearly Optimal Robust Subspace Tracking

• Computer Science
ICML
• 2018
This work studies the robust subspace tracking (RST) problem and obtains one of the first two provable guarantees for it, and develops a recursive projected compressive sensing algorithm that is called Nearly Optimal RST via ReProCS (ReProCS-NORST) because its tracking delay is nearly optimal.

### Provable Dynamic Robust PCA or Robust Subspace Tracking

• Computer Science
IEEE Transactions on Information Theory
• 2019
This paper provides the first guarantee for dynamic RPCA that holds under weakened versions of standard RPCA assumptions, slow subspace change, and a lower bound assumption on most outlier magnitudes.

### Fast Robust Subspace Tracking via PCA in Sparse Data-Dependent Noise

• Computer Science
IEEE Journal on Selected Areas in Information Theory
• 2020
This work introduces a “fast” mini-batch robust ST solution that is provably correct under mild assumptions and introduces a novel non-asymptotic guarantee for PCA in linearly data-dependent noise.

### Nearly Optimal Robust Subspace Tracking and Dynamic Robust PCA

• Computer Science
ICML 2018
• 2017
This work proposes a recursive projected compressive sensing based algorithm called NORST (Nearly Optimal RST) and proves that it solves both the RST and the dynamic RPCA problems, under weakened standard RPCA assumptions, slow subspace change, and two simple extra assumptions (outlier magnitudes lower bounded, and mutual independence of the data vectors).

## References

Showing 1–10 of 64 references

### Recovering Low-Rank and Sparse Components of Matrices from Incomplete and Noisy Observations

• Computer Science
SIAM J. Optim.
• 2011
This paper studies the recovery task in the general setting where only a fraction of the matrix entries can be observed and the observations are corrupted by both impulsive and Gaussian noise, and shows that the resulting model falls within the applicable scope of the classical augmented Lagrangian method.

### Stable Principal Component Pursuit

• Computer Science
2010 IEEE International Symposium on Information Theory
• 2010
This result shows that the proposed convex program recovers the low-rank matrix even when a positive fraction of its entries are arbitrarily corrupted, with an error bound proportional to the noise level; it is the first result showing that classical Principal Component Analysis, optimal for small i.i.d. noise, can be made robust to gross sparse errors.

### Robust principal component analysis?

• Computer Science
JACM
• 2011
It is proved that, under suitable assumptions, both the low-rank and the sparse components can be recovered exactly by solving a convenient convex program called Principal Component Pursuit, which, among all feasible decompositions, minimizes a weighted combination of the nuclear norm and the ℓ1 norm; this suggests a principled approach to robust principal component analysis.

### A novel M-estimator for robust PCA

• Computer Science
J. Mach. Learn. Res.
• 2014
The minimizer and its subspace are interpreted as robust versions of the empirical inverse covariance and the PCA subspace, respectively; the method is compared with many other robust PCA algorithms on synthetic and real data sets and demonstrates state-of-the-art speed and accuracy.

### Robust Matrix Decomposition with Outliers

• Computer Science
ArXiv
• 2010
This work studies conditions under which recovering a given observation matrix as the sum of a low-rank matrix and a sparse matrix is possible via a combination of $\ell_1$ norm and trace norm minimization and obtains stronger recovery guarantees than previous studies.

### Recursive Robust PCA or Recursive Sparse Recovery in Large but Structured Noise

• Computer Science
IEEE Transactions on Information Theory
• 2014
This work proposes a simple modification of the original ReProCS idea that assumes knowledge of a subspace change model on the Lt's, and shows that the proposed approach can exactly recover the support set of St at all times, with the reconstruction errors of both St and Lt upper bounded by a small, time-invariant value.
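The ReProCS idea summarized above can be sketched per time instant: project the measurement onto the orthogonal complement of the current subspace estimate, which approximately nulls the low-rank part, then recover the sparse outlier by ℓ1 minimization. A minimal illustration follows, with iterative soft thresholding standing in for the paper's sparse-recovery step (the function name and parameters are hypothetical, not from the paper):

```python
import numpy as np

def soft(x, tau):
    """Entrywise soft thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def reprocs_sparse_step(m_t, P_hat, lam=0.01, iters=1000):
    """Illustrative single ReProCS-style step: project m_t = l_t + s_t onto
    the orthogonal complement of the subspace estimate P_hat, which
    approximately nulls the low-rank part l_t, then recover the sparse s_t
    from y = Phi @ s_t via iterative soft thresholding (a basic l1 solver)."""
    n = m_t.shape[0]
    Phi = np.eye(n) - P_hat @ P_hat.T       # projector onto the complement
    y = Phi @ m_t                           # low-rank part removed (approx.)
    s = np.zeros(n)
    for _ in range(iters):                  # ISTA, step size 1 (||Phi|| = 1)
        s = soft(s + Phi @ (y - Phi @ s), lam)
    return s
```

Because the projected measurement is low-dimensional noise plus a sparse signal, the sparse recovery succeeds under much milder sparsity conditions than recovering s_t from m_t directly.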

### Robust Matrix Decomposition With Sparse Corruptions

• Computer Science
IEEE Transactions on Information Theory
• 2011
This work studies conditions under which recovering a given observation matrix as the sum of a low-rank matrix and a sparse matrix is possible via a combination of ℓ1 norm and trace norm minimization and obtains stronger recovery guarantees than previous studies.

### Exact Matrix Completion via Convex Optimization

• Computer Science, Mathematics
Found. Comput. Math.
• 2009
It is proved that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries, and that objects other than signals and images can be perfectly reconstructed from very limited information.
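The result above concerns a nuclear-norm convex program. As a lightweight illustration of the underlying idea (a low-rank matrix is recoverable from a subset of its entries), here is a simple hard-impute iteration: alternately fill the unobserved entries from the current estimate and truncate to the target rank. This is not the paper's algorithm, and the rank is assumed known here:

```python
import numpy as np

def complete_by_svd(M_obs, mask, rank, iters=300):
    """Illustrative hard-impute iteration for matrix completion (a simple
    stand-in for the nuclear-norm convex program): alternately fill the
    unobserved entries from the current estimate and project back onto
    rank-`rank` matrices via a truncated SVD."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        Y = np.where(mask, M_obs, X)        # keep observed, impute the rest
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return X
```

With enough observed entries relative to the degrees of freedom of a rank-r matrix, this fixed-point iteration converges to the true matrix on well-conditioned instances.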

### Robust PCA and subspace tracking from incomplete observations using ℓ0-surrogates

• Computer Science
Comput. Stat.
• 2014
This work proposes a method that allows for reconstructing and tracking a subspace of upper-bounded dimension from incomplete and corrupted observations and can cope with more outliers and with an underlying matrix of higher rank than other state-of-the-art methods.