RES-PCA: A Scalable Approach to Recovering Low-Rank Matrices

@article{Peng2019RESPCAAS,
  title={RES-PCA: A Scalable Approach to Recovering Low-Rank Matrices},
  author={Chong Peng and Chenglizhao Chen and Zhao Kang and Jianbo Li and Qiang Shawn Cheng},
  journal={2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2019},
  pages={7309-7317}
}
Robust principal component analysis (RPCA) has drawn significant attention due to its powerful capability in recovering low-rank matrices, as well as its successful applications to various real-world problems. Current state-of-the-art algorithms usually need to compute the singular value decomposition (SVD) of large matrices, which generally has at least quadratic, or even cubic, complexity. This drawback has limited the application of RPCA to real-world problems. To combat this drawback, in this…
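For context, the convex program that most of the solvers surveyed below target is principal component pursuit (PCP), which splits an observed matrix M ∈ R^{m×n} into a low-rank part L and a sparse part S:

```latex
\min_{L,\,S}\ \|L\|_{*} + \lambda\|S\|_{1}
\quad \text{s.t.} \quad M = L + S,
\qquad \lambda = \frac{1}{\sqrt{\max(m,n)}}
```

The nuclear-norm term is minimized by singular value thresholding, which needs an SVD of an m×n matrix per iteration at O(min(mn², m²n)) cost; this is the bottleneck the abstract refers to.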
Robust principal component analysis: A factorization-based approach with linear complexity
TLDR
This paper proposes a novel RPCA model based on matrix tri-factorization, which only needs the computation of SVDs of very small matrices, reducing the complexity of RPCA to linear and making it fully scalable.
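As a hedged sketch of how tri-factorization removes large SVDs (a plausible form consistent with this summary; the paper's exact constraints may differ): write the low-rank part as ABC with orthonormal side factors, so the nuclear norm transfers to the small k×k middle factor:

```latex
\min_{A,B,C,S}\ \|B\|_{*} + \lambda\|S\|_{1}
\quad \text{s.t.} \quad M = ABC + S,\ \ A^{\top}A = I_{k},\ \ CC^{\top} = I_{k}
```

Since ‖ABC‖_* = ‖B‖_* under these orthogonality constraints, each iteration only needs the SVD of a k×k matrix, and the per-iteration cost becomes linear in the matrix dimensions.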
Self-Paced Two-dimensional PCA
TLDR
This work proposes a novel method called SP2DPCA, which progresses from 'easy' to 'complex' samples, searching for an optimal projection matrix while filtering out outliers iteratively, demonstrating the robust nature of the method.
Scaled Simplex Representation for Subspace Clustering
TLDR
The proposed SSR-based SC (SSRSC) model is reformulated as a linear equality-constrained problem, which is solved efficiently under the alternating direction method of multipliers framework and outperforms state-of-the-art SC methods in accuracy.
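A hedged illustration of what a scaled simplex representation constrains (the coefficient structure only; Ω stands in for the paper's regularizer, which is not reproduced here): each sample is coded over the others with nonnegative coefficients summing to a scale s:

```latex
\min_{C}\ \|X - XC\|_{F}^{2} + \lambda\,\Omega(C)
\quad \text{s.t.} \quad C \ge 0,\ \ \mathbf{1}^{\top}C = s\,\mathbf{1}^{\top}
```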
Self-paced Principal Component Analysis
TLDR
This work proposes a novel method called Self-paced PCA (SPCA), which finds an optimal projection matrix while filtering out outliers iteratively, and can improve considerably on state-of-the-art results.
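The self-paced mechanism used by this entry and SP2DPCA above is standard self-paced learning: binary sample weights v_i are optimized jointly with the model, and a pace parameter λ grows across iterations to admit progressively harder samples (generic form, not the papers' exact objectives):

```latex
\min_{w,\ v\in[0,1]^{n}}\ \sum_{i=1}^{n} v_{i}\,\ell_{i}(w) - \lambda\sum_{i=1}^{n} v_{i},
\qquad v_{i}^{\ast} = \begin{cases} 1, & \ell_{i}(w) < \lambda\\ 0, & \text{otherwise} \end{cases}
```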
Feature Selection Embedded Robust K-Means
TLDR
Extensive experimental results show clustering performance superior to the baseline methods, demonstrating the effectiveness of the proposed method.
A Unified Weight Learning and Low-Rank Regression Model for Robust Face Recognition
TLDR
A unified sparse weight learning and low-rank approximation regression model is applied to robust face recognition in the presence of varying types and levels of corruption, such as random pixel corruption, block occlusion, or disguise.
Bio-Inspired Structure Representation Based Cross-View Discriminative Subspace Learning via Simultaneous Local and Global Alignment
TLDR
A novel cross-view discriminative feature subspace learning method inspired by layered visual perception in humans, which utilizes a separable low-rank self-representation model to disentangle the class and view structure layers.
Dynamic background subtraction with masked RPCA
TLDR
This thesis proposes masked RPCA to process backgrounds containing moving textures, and compares the proposed method with state-of-the-art, end-to-end non-deep-learning methods to demonstrate its advantages.
Generalized Locally-Linear Embedding: A Neural Network Implementation
TLDR
This work introduces an extra fully-connected layer whose weights act as reconstruction coefficients (i.e., relations among the samples) so that the latent representation preserves the neighborhood structure of locally-linear embedding.
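For reference, the neighborhood structure in question is that of classical locally-linear embedding, whose reconstruction weights solve:

```latex
\min_{W}\ \sum_{i}\Big\|x_{i} - \sum_{j\in\mathcal{N}(i)} W_{ij}\,x_{j}\Big\|_{2}^{2}
\quad \text{s.t.} \quad \sum_{j} W_{ij} = 1 \ \ \forall i
```

In the cited network, the weights of the extra fully-connected layer play the role of W.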
Discriminative Ridge Machine: A Classifier for High-Dimensional Data or Imbalanced Data
  • Chong Peng, Q. Cheng
  • Computer Science, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2021
In this article, we introduce a discriminative ridge regression approach to supervised classification. It estimates a representation model while accounting for discriminativeness between classes, …

References

Showing 1–10 of 30 references
A Fast Factorization-Based Approach to Robust PCA
TLDR
A novel factorization-based model of RPCA, which has a complexity of O(kdn), where k is an upper bound on the true rank, and can be used as a lightweight, scalable tool for RPCA in the absence of the precise value of the true rank.
Non-convex Robust PCA
TLDR
A new provable method for robust PCA, where the task is to recover a low-rank matrix corrupted with sparse perturbations; it represents one of the few instances of global convergence guarantees for non-convex methods.
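A minimal sketch of the alternating-projection idea behind this method, under simplifying assumptions (a single stage with a fixed, hypothetical threshold `thresh`; the actual algorithm uses a stagewise rank schedule and decaying thresholds):

```python
import numpy as np

def rpca_altproj(M, rank, thresh, n_iter=100):
    """Alternate projections onto the set of rank-`rank` matrices
    (truncated SVD) and the set of sparse matrices (hard thresholding).
    Simplified sketch; `thresh` is fixed here for illustration."""
    M = np.asarray(M, dtype=float)
    S = np.where(np.abs(M) > thresh, M, 0.0)       # initial sparse estimate
    for _ in range(n_iter):
        # project the residual onto rank-`rank` matrices via truncated SVD
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # project the residual onto sparse matrices via hard thresholding
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)
    return L, S
```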
The Augmented Lagrange Multiplier Method for Exact Recovery of Corrupted Low-Rank Matrices
TLDR
The method of augmented Lagrange multipliers (ALM) is applied to solve the robust PCA problem, namely recovering a low-rank matrix with an unknown fraction of its entries arbitrarily corrupted, and a necessary and sufficient condition for the inexact ALM to converge globally is proved.
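For reference, a compact sketch of the inexact ALM iteration for principal component pursuit, following the standard update order from this line of work (singular value thresholding for L, soft thresholding for S, dual ascent on Y); the parameter defaults below are common choices, not necessarily the paper's exact settings:

```python
import numpy as np

def soft(X, tau):
    """Elementwise soft thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_ialm(M, lam=None, tol=1e-7, max_iter=500):
    """Inexact ALM for PCP:  min ||L||_* + lam*||S||_1  s.t.  M = L + S."""
    M = np.asarray(M, dtype=float)
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm2 = np.linalg.norm(M, 2)
    Y = M / max(norm2, np.max(np.abs(M)) / lam)    # dual initialization
    mu, rho = 1.25 / norm2, 1.5                    # common default schedule
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # L-update: singular value thresholding of (M - S + Y/mu)
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * soft(s, 1.0 / mu)) @ Vt
        # S-update: soft thresholding of the residual
        S = soft(M - L + Y / mu, lam / mu)
        R = M - L - S
        Y += mu * R                                # dual ascent
        mu *= rho
        if np.linalg.norm(R, 'fro') <= tol * np.linalg.norm(M, 'fro'):
            break
    return L, S
```

Each iteration requires a full SVD of an m×n matrix, which is precisely the cost that RES-PCA and the factorization-based methods above aim to avoid.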
Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Matrices via Convex Optimization
TLDR
It is proved that most matrices A can be efficiently and exactly recovered from most error sign-and-support patterns by solving a simple convex program, for which a fast and provably convergent algorithm is given.
Background Recovery by Fixed-Rank Robust Principal Component Analysis
TLDR
Comprehensive tests show that, by fixing the rank of the low-rank matrix to a known value, the fixed-rank algorithm produces more reliable and accurate results than existing low-rank RPCA algorithms.
Robust PCA Via Nonconvex Rank Approximation
TLDR
This work proposes a nonconvex rank approximation that outperforms current state-of-the-art algorithms in both accuracy and efficiency, and develops an efficient augmented-Lagrange-multiplier-based optimization algorithm.
Stable Principal Component Pursuit
TLDR
This result shows that the proposed convex program recovers the low-rank matrix even though a positive fraction of its entries are arbitrarily corrupted, with an error bound proportional to the noise level; it is the first result showing that classical principal component analysis, optimal for small i.i.d. noise, can be made robust to gross sparse errors.
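Stable PCP relaxes the equality constraint of PCP to tolerate dense noise of level δ, which is where the noise-proportional error bound comes from:

```latex
\min_{L,\,S}\ \|L\|_{*} + \lambda\|S\|_{1}
\quad \text{s.t.} \quad \|M - L - S\|_{F} \le \delta
```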
Robust principal component analysis?
TLDR
It is proved that, under suitable assumptions, both the low-rank and the sparse components can be recovered exactly by solving a very convenient convex program called Principal Component Pursuit, which among all feasible decompositions simply minimizes a weighted combination of the nuclear norm and the ℓ1 norm; this suggests a principled approach to robust principal component analysis.
Robust PCA via Outlier Pursuit
TLDR
This work presents an efficient convex-optimization-based algorithm, called outlier pursuit, which under mild assumptions on the uncorrupted points recovers the exact optimal low-dimensional subspace and identifies the corrupted points.
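Outlier pursuit swaps the elementwise ℓ1 penalty of PCP for a column-wise ℓ1,2 penalty, since here whole points (columns) are corrupted rather than scattered entries:

```latex
\min_{L,\,C}\ \|L\|_{*} + \lambda\|C\|_{1,2}
\quad \text{s.t.} \quad M = L + C,
\qquad \|C\|_{1,2} = \sum_{j}\|C_{:,j}\|_{2}
```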
Subspace Clustering Using Log-determinant Rank Approximation
TLDR
The log-determinant low-rank optimization method is used to solve the subspace clustering problem; the method of augmented Lagrangian multipliers is applied to optimize this non-convex rank-approximation-based objective function, with closed-form solutions for all subproblems when minimizing the different variables alternately.
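The log-determinant surrogate replaces the rank with a concave function of the singular values; one common form, following the log-det heuristic of Fazel et al. and shown for illustration rather than as this paper's exact definition:

```latex
\operatorname{rank}(X)\ \approx\ \sum_{i}\log\big(\sigma_{i}(X) + \varepsilon\big),
\qquad \text{equivalently } \log\det\big(X + \varepsilon I\big) \text{ for } X \succeq 0
```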