Exactly Robust Kernel Principal Component Analysis
@article{Fan2020ExactlyRK,
  title={Exactly Robust Kernel Principal Component Analysis},
  author={Jicong Fan and Tommy W. S. Chow},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2020},
  volume={31},
  pages={749-761}
}
Robust principal component analysis (RPCA) can recover low-rank matrices when they are corrupted by sparse noises. In practice, many matrices are, however, of high rank and, hence, cannot be recovered by RPCA. We propose a novel method called robust kernel principal component analysis (RKPCA) to decompose a partially corrupted matrix as a sparse matrix plus a high- or full-rank matrix with low latent dimensionality. RKPCA can be applied to many problems such as noise removal and subspace…
23 Citations
Robust Non-Linear Matrix Factorization for Dictionary Learning, Denoising, and Clustering
- Computer ScienceIEEE Transactions on Signal Processing
- 2021
This work proposes a new robust nonlinear factorization method called Robust Non-Linear Matrix Factorization (RNLMF), which constructs a dictionary for the data space by factoring a kernelized feature space and scales to matrices with thousands of rows and columns.
Robust Kernel Principal Component Analysis With ℓ2,1-Regularized Loss Minimization
- Computer ScienceIEEE Access
- 2020
This paper proposes a robust KPCA method via a reformulation in Euclidean space, where an error measurement is introduced into the loss function and $\ell_{2,1}$-regularization is added to it.
Factor Group-Sparse Regularization for Efficient Low-Rank Matrix Recovery
- Computer ScienceNeurIPS
- 2019
Compared to the max norm and the factored formulation of the nuclear norm, factor group-sparse regularizers are more efficient, accurate, and robust to the initial guess of rank.
A Robust Method for Kernel Principal Component Analysis
- Computer Science2020 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)
- 2020
This work proposes a robust loss function combined with $\ell_{2,1}$ regularization for KPCA, inspired by sparse PCA via variable projection, and demonstrates the effectiveness of the proposed method through a numerical example for outlier detection.
A generalized least-squares approach regularized with graph embedding for dimensionality reduction
- Computer SciencePattern Recognit.
- 2020
Enhanced PSSV for Incomplete Data Analysis
- Computer ScienceIEEE Access
- 2020
This work imposes variance regularization in PSSV by analyzing PSSV and the truncated nuclear norm, and shows the effectiveness of the proposed approach on background extraction from incomplete videos and data, image denoising, and clustering.
Surface Defects Detection Using Non-convex Total Variation Regularized RPCA With Kernelization
- Computer ScienceIEEE Transactions on Instrumentation and Measurement
- 2021
An unsupervised surface defect detection method based on nonconvex total variation (TV) regularized RPCA with kernelization, named KRPCA-NTV, which outperforms competing methods in terms of accuracy and generalizability is proposed.
Principal Component Analysis Based on T$\ell_1$-norm Maximization
- Computer Science
- 2020
Numerical experiments show that the proposed T$\ell_1$-norm-based PCA clearly outperforms PCA-p, $\ell_p$SPCA, standard PCA, and PCA-$\ell_1$.
Fault Detection of Wind Turbines by Subspace Reconstruction-Based Robust Kernel Principal Component Analysis
- EngineeringIEEE Transactions on Instrumentation and Measurement
- 2021
A fault detection framework based on subspace reconstruction-based robust kernel principal component analysis (SR-RKPCA) is proposed for wind turbine SCADA data, extracting nonlinear features under discontinuous interference to improve the stability of wind turbine fault detection.
Robust 2DPCA by Tℓ₁ Criterion Maximization for Image Recognition
- Computer ScienceIEEE Access
- 2021
Two-dimensional principal component analysis based on the Tℓ₁ criterion is proposed, and the experimental results show that its performance is superior to that of classical 2DPCA, 2DPCA-L1, 2DPCA-L1-S, N-2DPCA, G2DPCA, and Angle-2DPCA.
References
Showing 1-10 of 57 references
Reinforced Robust Principal Component Pursuit
- Computer ScienceIEEE Transactions on Neural Networks and Learning Systems
- 2018
It is argued that it is necessary to study the presence of outliers not only in the observed data matrix but also in the orthogonal complement subspace of the authentic principal subspace, because the latter can seriously skew the estimation of the principal components.
Robust principal component analysis?
- Computer ScienceJACM
- 2011
It is proved that, under suitable assumptions, both the low-rank and the sparse components can be recovered exactly by solving a convenient convex program called Principal Component Pursuit, which among all feasible decompositions minimizes a weighted combination of the nuclear norm and the $\ell_1$ norm; this suggests a principled approach to robust principal component analysis.
R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization
- Computer ScienceICML
- 2006
Experiments on several real-life datasets show that R1-PCA can effectively handle outliers, and it is shown that L1-norm K-means leads to poor results while R1-K-means outperforms standard K-means.
Robust Subspace Clustering With Complex Noise
- Computer ScienceIEEE Transactions on Image Processing
- 2015
Experimental results on three commonly used data sets show that the proposed optimization model for robust subspace clustering outperforms state-of-the-art subspace clustering methods.
The Augmented Lagrange Multiplier Method for Exact Recovery of Corrupted Low-Rank Matrices
- Computer ScienceArXiv
- 2010
A closed form solution to robust subspace estimation and clustering
- Computer Science, MathematicsCVPR 2011
- 2011
This work uses an augmented Lagrangian optimization framework, which requires combining the proposed polynomial thresholding operator with the more traditional shrinkage-thresholding operator, to solve the problem of fitting one or more subspaces to a collection of data points drawn from those subspaces and corrupted by noise/outliers.
Robust Principal Component Analysis with Complex Noise
- Computer Science, GeologyICML
- 2014
This work proposes a generative RPCA model under the Bayesian framework by modeling data noise as a mixture of Gaussians (MoG), a universal approximator to continuous distributions and thus the model is able to fit a wide range of noises such as Laplacian, Gaussian, sparse noises and any combinations of them.
Robust Kernel Low-Rank Representation
- Computer ScienceIEEE Transactions on Neural Networks and Learning Systems
- 2016
This work proposes the robust kernel LRR (RKLRR) approach, and develops an efficient optimization algorithm to solve it based on the alternating direction method, and shows that both the subproblems in the optimization algorithm can be efficiently and exactly solved.
Sparse subspace clustering for data with missing entries and high-rank matrix completion
- Computer Science, EngineeringNeural Networks
- 2017
Learning Structured Low-Rank Representation via Matrix Factorization
- Computer ScienceAISTATS
- 2016
This paper proposes to learn structured LRR by factorizing the nuclear norm regularized matrix, which leads to the proposed non-convex formulation NLRR, a general framework for unifying a variety of popular algorithms including LRR, dictionary learning, robust principal component analysis, sparse subspace clustering, etc.