Fast computation of robust subspace estimators

@article{CevallosValdiviezo2019FastCO,
  title={Fast computation of robust subspace estimators},
  author={Holger Cevallos-Valdiviezo and Stefan Van Aelst},
  journal={Comput. Stat. Data Anal.},
  year={2019},
  volume={134},
  pages={171-185}
}

Citations

Sparse Principal Component Analysis Based on Least Trimmed Squares
TLDR
A robust sparse PCA method is proposed to handle potential outliers in the data, based on the least trimmed squares PCA method, which provides robust but non-sparse PC estimates; the computation time is greatly reduced (the underlying LTS idea is sketched after this list).
Robust variable screening for regression using factor profiling
TLDR
A robust screening method is proposed that uses least trimmed squares to estimate the latent factors and the factor-profiled variables; it outperforms the two nonrobust methods on contaminated data.
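
Both citing papers above build on the least trimmed squares (LTS) subspace idea of the indexed paper: fit a low-dimensional subspace using only the h observations with the smallest orthogonal residuals. The NumPy code below is a minimal illustrative sketch of that idea via concentration steps, not the authors' actual algorithm; the random starting subsets, the fixed number of starts, and the stopping rule are simplifying assumptions.

import numpy as np

def lts_subspace(X, h, q, n_starts=20, max_iter=100, seed=None):
    """Illustrative LTS subspace fit: keep the h observations with the
    smallest orthogonal distances to a q-dimensional PCA subspace and
    iterate (concentration steps); the trimmed objective never increases."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    best_obj, best_fit = np.inf, None
    for _ in range(n_starts):
        H = np.sort(rng.choice(n, size=h, replace=False))   # random initial h-subset
        for _ in range(max_iter):
            mu = X[H].mean(axis=0)
            _, _, Vt = np.linalg.svd(X[H] - mu, full_matrices=False)
            V = Vt[:q].T                                     # p x q loading matrix
            R = (X - mu) - (X - mu) @ V @ V.T                # orthogonal residuals
            d2 = np.einsum('ij,ij->i', R, R)                 # squared distances
            H_new = np.sort(np.argsort(d2)[:h])              # concentration step
            if np.array_equal(H_new, H):
                break
            H = H_new
        obj = np.sort(d2)[:h].sum()                          # trimmed objective
        if obj < best_obj:
            best_obj, best_fit = obj, (mu, V)
    return best_obj, best_fit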

References

SHOWING 1-10 OF 43 REFERENCES
A comparison of three procedures for robust PCA in high dimensions
TLDR
Three procedures for robust Principal Components Analysis (PCA) are compared by means of a simulation study, and an adjusted algorithm is presented that yields several PCA models in a single run.
S-Estimators for Functional Principal Component Analysis
Principal component analysis is a widely used technique that provides an optimal lower-dimensional approximation to multivariate or functional datasets.
ROBPCA: A New Approach to Robust Principal Component Analysis
TLDR
The ROBPCA approach, which combines projection pursuit ideas with robust scatter matrix estimation, yields more accurate estimates on noncontaminated datasets and more robust estimates on contaminated data.
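
For intuition, the following is a rough sketch of the projection-pursuit outlyingness measure used in ROBPCA's first stage; directions through random pairs of observations and median/MAD standardization are simplifying choices here, and the full method additionally applies MCD-type scatter estimation to the reduced data.

import numpy as np

def outlyingness(X, n_dirs=250, seed=None):
    """Projection-pursuit outlyingness: project onto many directions through
    pairs of observations and record, for each point, its largest robustly
    standardized absolute projection (median/MAD used here for simplicity)."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    i, j = rng.integers(0, n, n_dirs), rng.integers(0, n, n_dirs)
    D = X[i] - X[j]
    D = D[np.linalg.norm(D, axis=1) > 1e-12]
    D /= np.linalg.norm(D, axis=1, keepdims=True)
    P = X @ D.T                                      # n x n_dirs projections
    med = np.median(P, axis=0)
    mad = np.median(np.abs(P - med), axis=0) + 1e-12
    return np.max(np.abs(P - med) / mad, axis=1)     # one score per observation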
ROBUST PRINCIPAL COMPONENT ANALYSIS BASED ON TRIMMING AROUND AFFINE SUBSPACES
Principal Component Analysis (PCA) is a widely used technique for reducing the dimensionality of multivariate data. The principal component subspace is defined as the affine subspace of a given dimension that best approximates the data in the least squares sense.
Robust Orthogonal Complement Principal Component Analysis
TLDR
A novel robust orthogonal complement principal component analysis (ROC-PCA) is proposed, which combines the popular sparsity-enforcing and low-rank regularization techniques to deal with row-wise outliers as well as element-wise outliers.
Reinforced Robust Principal Component Pursuit
TLDR
It is argued that it is necessary to study the presence of outliers not only in the observed data matrix but also in the orthogonal complement subspace of the authentic principal subspace, because the latter can seriously skew the estimation of the principal components.
A Fast Algorithm for the Minimum Covariance Determinant Estimator
TLDR
For small datasets, FAST-MCD typically finds the exact MCD, whereas for larger datasets it gives more accurate results than existing algorithms and is faster by orders of magnitude.
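
As a rough illustration of the concentration ("C-") step that FAST-MCD iterates, here is a sketch; the real algorithm additionally uses selective iteration, nested partitioning of large datasets, and a reweighting step, all omitted here.

import numpy as np

def c_step(X, H):
    """One concentration step: refit mean/covariance on the current subset
    and keep the same number of points with the smallest Mahalanobis
    distances; det(cov) of the subset cannot increase."""
    mu = X[H].mean(axis=0)
    S = np.cov(X[H], rowvar=False)
    diff = X - mu
    d2 = np.einsum('ij,ij->i', diff @ np.linalg.pinv(S), diff)
    return np.sort(np.argsort(d2)[:len(H)])

def crude_mcd(X, h, n_starts=50, seed=0):
    """Repeat C-steps from elemental (p+1)-point starts and keep the
    h-subset with the smallest covariance determinant (a sketch only)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_det, best = np.inf, None
    for _ in range(n_starts):
        J = rng.choice(n, size=p + 1, replace=False)          # elemental start
        mu, S = X[J].mean(axis=0), np.cov(X[J], rowvar=False)
        diff = X - mu
        d2 = np.einsum('ij,ij->i', diff @ np.linalg.pinv(S), diff)
        H = np.sort(np.argsort(d2)[:h])                       # initial h-subset
        for _ in range(100):
            H_new = c_step(X, H)
            if np.array_equal(H_new, H):
                break
            H = H_new
        det = np.linalg.det(np.cov(X[H], rowvar=False))
        if det < best_det:
            best_det, best = det, (X[H].mean(axis=0), np.cov(X[H], rowvar=False))
    return best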
Coherence Pursuit: Fast, Simple, and Robust Principal Component Analysis
TLDR
CoP is the first robust PCA algorithm that is simultaneously non-iterative, provably robust to both unstructured and structured outliers, and able to tolerate a large number of unstructured outliers.
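
A minimal sketch of the coherence scoring behind CoP, assuming the data points are stored as columns and the target rank r is known; how many high-coherence columns to retain is a heuristic choice here, not the rule from the paper.

import numpy as np

def coherence_pursuit(X, r, n_keep=None):
    """Coherence Pursuit sketch: inliers lying in a low-dimensional subspace
    are mutually coherent, so score each column by the norm of its inner
    products with the other (normalized) columns and span the subspace with
    the top-scoring columns."""
    p, n = X.shape                                   # points stored as columns
    Xn = X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-12)
    G = Xn.T @ Xn
    np.fill_diagonal(G, 0.0)                         # ignore self-coherence
    scores = np.linalg.norm(G, axis=1)               # l2 coherence score per point
    n_keep = n_keep if n_keep is not None else max(2 * r, n // 2)  # heuristic
    keep = np.argsort(scores)[::-1][:n_keep]
    U, _, _ = np.linalg.svd(Xn[:, keep], full_matrices=False)
    return U[:, :r], scores                          # orthonormal basis, scores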
A Fast Algorithm for S-Regression Estimates
TLDR
This article proposes an analogous algorithm for computing S-estimators that is based on a “local improvement” step applied to the initial resampling candidates, and performs a simulation study which shows that S-estimators computed with the fast-S algorithm compare favorably to LTS-estimates computed with the fast-LTS algorithm.
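
To make the "local improvement" idea concrete, here is a hedged sketch of a single I-step for S-regression with Tukey's biweight (c ≈ 1.547 and b = 0.5 give a 50% breakdown point): a reweighted least squares update of the coefficients followed by one fixed-point update of the residual M-scale. It illustrates the kind of step fast-S applies to its resampling candidates, not the published implementation.

import numpy as np

def tukey_rho(u, c=1.547):
    v = np.clip(np.abs(u) / c, 0.0, 1.0)
    return 1.0 - (1.0 - v**2) ** 3                        # bounded rho, max value 1

def tukey_w(u, c=1.547):
    v = np.abs(u) / c
    return np.where(v <= 1.0, (1.0 - v**2) ** 2, 0.0)     # psi(u)/u weights

def s_istep(X, y, beta, s, c=1.547, b=0.5):
    """One local-improvement (I-)step: reweighted least squares update of the
    coefficients, then one fixed-point update of the residual M-scale."""
    r = y - X @ beta
    w = tukey_w(r / s, c)
    sw = np.sqrt(w)
    beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    r_new = y - X @ beta_new
    s_new = s * np.sqrt(np.mean(tukey_rho(r_new / s, c)) / b)
    return beta_new, s_new

Repeating such steps from many resampling candidates and keeping the fits with the smallest M-scale is the essence of the fast-S strategy.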
Robust nonlinear principal components
TLDR
A predictive approach is proposed in which a spline curve is fitted by minimizing a residual M-scale; it is almost as good as other proposals for row-wise contamination, and is better for element-wise contamination.
...