Robust principal component analysis?

@article{Cands2011RobustPC,
  title={Robust principal component analysis?},
  author={Emmanuel J. Cand{\`e}s and Xiaodong Li and Yi Ma and John Wright},
  journal={J. ACM},
  year={2011},
  volume={58},
  number={3},
  pages={11:1--11:37}
}
This article is about a curious phenomenon. Suppose we have a data matrix, which is the superposition of a low-rank component and a sparse component. Can we recover each component individually? We prove that under some suitable assumptions, it is possible to recover both the low-rank and the sparse components exactly by solving a very convenient convex program called Principal Component Pursuit; among all feasible decompositions, simply minimize a weighted combination of the nuclear norm and of…
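The Principal Component Pursuit program minimizes ‖L‖* + λ‖S‖₁ subject to L + S = M. As a minimal sketch (not the paper's reference implementation), one standard solver is an augmented-Lagrange-multiplier/ADMM iteration with λ = 1/√max(m, n) as in the paper; the function names and the fixed-μ heuristic below are illustrative assumptions:

```python
import numpy as np

def shrink(X, tau):
    # Soft thresholding: proximal operator of the l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * shrink(s, tau)) @ Vt

def pcp(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Split M into low-rank L and sparse S via an ALM/ADMM iteration (a sketch)."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n)) if lam is None else lam
    # Common heuristic for the penalty parameter: mu = m*n / (4 * ||M||_1).
    mu = 0.25 * m * n / np.abs(M).sum() if mu is None else mu
    Y = np.zeros_like(M)           # Lagrange multiplier
    S = np.zeros_like(M)
    norm_M = np.linalg.norm(M)
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)        # low-rank update
        S = shrink(M - L + Y / mu, lam / mu)     # sparse update
        residual = M - L - S
        Y += mu * residual                        # dual ascent
        if np.linalg.norm(residual) < tol * norm_M:
            break
    return L, S
```

On a synthetic low-rank-plus-sparse matrix well inside the recovery regime, this iteration recovers both components to high accuracy.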
Linear time Principal Component Pursuit and its extensions using ℓ1 filtering
TLDR: A novel algorithm, called ℓ1 filtering, is proposed for exactly solving PCP with O(r^2(m+n)) complexity, where m×n is the size of the data matrix and r is the rank of the matrix to recover, which is assumed to be much smaller than m and n.
Robust principal component analysis?: Recovering low-rank matrices from sparse errors
The problem of recovering a low-rank data matrix from corrupted observations arises in many application areas, including computer vision, system identification, and bioinformatics. Recently it was…
Stable Principal Component Pursuit
TLDR: This result shows that the proposed convex program recovers the low-rank matrix even when a positive fraction of its entries is arbitrarily corrupted, with an error bound proportional to the noise level. This is the first result showing that classical Principal Component Analysis, optimal for small i.i.d. noise, can be made robust to gross sparse errors.
Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices via Convex Optimization
TLDR: It is proved that most matrices A can be efficiently and exactly recovered from most error sign-and-support patterns by solving a simple convex program, for which a fast and provably convergent algorithm is given.
A Robust Local Linear Data Decomposition
TLDR: A new Principal Component Pursuit decomposition represents the data matrix as the sum of a sparse additive component and a structured sparse representation using the data itself as a dictionary, which can describe data lying within a nonlinear manifold.
Robust principal component analysis via re-weighted minimization algorithms
  • D. Katselis, C. Beck
  • Mathematics, Computer Science
  • 2015 54th IEEE Conference on Decision and Control (CDC)
  • 2015
TLDR: The proposed methods perform at least as well as state-of-the-art schemes for Robust PCA, while allowing larger rank and sparsity regimes of the component matrices under exact recovery requirements.
Efficient algorithms for robust and stable principal component pursuit problems
TLDR: Numerical results on problems with millions of variables and constraints, such as foreground extraction from surveillance video, shadow and specularity removal from face images, and denoising of heavily corrupted video, show that the algorithms developed are competitive with current state-of-the-art solvers for RPCP and SPCP in terms of accuracy and speed.
Solving Principal Component Pursuit in Linear Time via ℓ1 Filtering
TLDR: It is proved that, under suitable conditions, this problem can be exactly solved by principal component pursuit (PCP), i.e., minimizing a combination of the nuclear norm and the ℓ1 norm; a novel algorithm, called ℓ1 filtering, is proposed, which is the first algorithm that can solve a nuclear norm minimization problem in linear time.
Robust Principal Component Analysis with Side Information
TLDR: This paper proposes a convex problem to incorporate side information in robust PCA and shows that the low-rank matrix can be exactly recovered via the proposed method under certain conditions.
Real-time Robust Principal Components' Pursuit
  • Chenlu Qiu, N. Vaswani
  • Computer Science, Mathematics
  • 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2010
TLDR: A solution that automatically handles correlated sparse outliers is proposed, motivated as a tool for video surveillance applications, with the background image sequence forming the low-rank part and the moving objects/persons/abnormalities forming the sparse part.

References

Showing 1–10 of 91 references
Fast Convex Optimization Algorithms for Exact Recovery of a Corrupted Low-Rank Matrix
This paper studies algorithms for solving the problem of recovering a low-rank matrix with a fraction of its entries arbitrarily corrupted. This problem can be viewed as a robust version of classical…
Matrix Completion With Noise
TLDR: This paper surveys the literature on matrix completion and presents novel results showing that matrix completion is provably accurate even when the few observed entries are corrupted with a small amount of noise; in practice, nuclear-norm minimization accurately fills in the many missing entries of large low-rank matrices from just a few noisy samples.
The Augmented Lagrange Multiplier Method for Exact Recovery of Corrupted Low-Rank Matrices
TLDR: The method of augmented Lagrange multipliers (ALM) is applied to solve the Robust PCA problem, namely recovering a low-rank matrix with an unknown fraction of its entries arbitrarily corrupted, and the necessary and sufficient condition for the inexact ALM to converge globally is proved.
A Singular Value Thresholding Algorithm for Matrix Completion
TLDR: This paper develops a simple, first-order, easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank, and develops a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
Robust ℓ1 norm factorization in the presence of outliers and missing data by alternative convex programming
  • Q. Ke, T. Kanade
  • Mathematics, Computer Science
  • 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05)
  • 2005
TLDR: This paper formulates matrix factorization as an ℓ1 norm minimization problem that is solved efficiently by alternative convex programming; the method is robust without requiring initial weighting, handles missing data straightforwardly, and provides a framework in which constraints and prior knowledge can be conveniently incorporated.
Exact Matrix Completion via Convex Optimization
TLDR: It is proved that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries, and that objects other than signals and images can be perfectly reconstructed from very limited information.
Sparse and low-rank matrix decomposition via alternating direction method
The problem of recovering the sparse and low-rank components of a matrix captures a broad spectrum of applications. Authors in [4] proposed the concept of “rank-sparsity incoherence” to characterize…
A Framework for Robust Subspace Learning
TLDR: The theory of Robust Subspace Learning (RSL) for linear models is developed within a continuous optimization framework based on robust M-estimation, and applies to a variety of linear learning problems in computer vision, including eigen-analysis and structure from motion.
The Power of Convex Relaxation: Near-Optimal Matrix Completion
  • E. Candès, T. Tao
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2010
TLDR: This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information-theoretic limit (up to logarithmic factors).
Recovering Low-Rank Matrices From Few Coefficients in Any Basis
  • D. Gross
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 2011
TLDR: It is shown that an unknown matrix of rank r can be efficiently reconstructed from only a few randomly sampled expansion coefficients with respect to any given matrix basis; the number of coefficients needed is quantified by the “degree of incoherence” between the unknown matrix and the basis.