# Principal Component Analysis and Optimization: A Tutorial

@inproceedings{Reris2015PrincipalCA, title={Principal Component Analysis and Optimization: A Tutorial}, author={Robert Reris and J. Paul Brooks}, year={2015} }

Principal component analysis (PCA) is one of the most widely used multivariate techniques in statistics. It is commonly used to reduce the dimensionality of data in order to examine its underlying structure and the covariance/correlation structure of a set of variables. While singular value decomposition provides a simple means for identification of the principal components (PCs) for classical PCA, solutions achieved in this manner may not possess certain desirable properties including…
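As an illustrative sketch (not code from the tutorial itself), classical PCA via the SVD can be computed in a few lines; the function name `pca_svd` and its interface are hypothetical:

```python
import numpy as np

def pca_svd(X, k):
    """Classical PCA via SVD: center the data, then take the top-k
    right singular vectors as the principal component loadings."""
    Xc = X - X.mean(axis=0)                       # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:k].T                           # (n_features, k) PC directions
    scores = Xc @ loadings                        # projections onto the PCs
    explained_var = s[:k] ** 2 / (X.shape[0] - 1) # variance along each PC
    return loadings, scores, explained_var

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
W, T, var = pca_svd(X, 2)
```

The loadings are orthonormal because the rows of `Vt` are, which is one of the properties the abstract alludes to; alternative (e.g. L1-norm) formulations discussed in the tutorial need not preserve it.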

## 39 Citations

### Sparse kernel feature extraction via support vector learning

- Computer Science · Pattern Recognit. Lett.
- 2018

### Robust and Sparse Kernel PCA and Its Outlier Map

- Computer Science · ICBIP '18
- 2018

A two-stage algorithm is proposed: a robust distance is computed to identify the uncontaminated data set, followed by estimation of the best-fit ellipsoid to these data for an informative and concise representation; a kernel PCA outlier map is also proposed to display and classify the outliers.

### Feature selection based on star coordinates plots associated with eigenvalue problems

- Computer Science · The Visual Computer
- 2020

A new feature relevance measure is proposed for star coordinates plots associated with the class of linear dimensionality reduction mappings defined through the solutions of eigenvalue problems, such as linear discriminant analysis or principal component analysis.

### Feature selection based on star coordinates plots associated with eigenvalue problems

- Computer Science · Vis. Comput.
- 2021

A new feature relevance measure is proposed for star coordinates plots associated with the class of linear dimensionality reduction mappings defined through the solutions of eigenvalue problems, such as linear discriminant analysis or principal component analysis.

### Estimating L1-Norm Best-Fit Lines for Data

- Computer Science
- 2017

This paper presents a procedure to estimate the L1-norm best-fit one-dimensional subspace (a line through the origin) to data in ℝᵐ based on an optimization criterion involving linear programming, but which can be performed using simple ratios and sortings.
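The paper's exact procedure is not reproduced here, but a hedged sketch of the simplest instance of the idea shows how such a fit reduces to ratios and a sort: for a line through the origin in the plane, the slope minimizing the sum of absolute residuals is a weighted median of the ratios y_i/x_i with weights |x_i|. The helper name below is illustrative:

```python
import numpy as np

def l1_slope_through_origin(x, y):
    """Slope b minimizing sum_i |y_i - b*x_i|.
    Since |y_i - b*x_i| = |x_i| * |y_i/x_i - b| for x_i != 0, the
    optimum is a weighted median of the ratios y_i/x_i with weights
    |x_i| -- computable with a single sort and a cumulative sum."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mask = x != 0
    ratios, weights = y[mask] / x[mask], np.abs(x[mask])
    order = np.argsort(ratios)
    ratios, weights = ratios[order], weights[order]
    cum = np.cumsum(weights)
    # first index where cumulative weight reaches half the total
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return ratios[idx]
```

Because a median rather than a mean of ratios is taken, a single gross outlier does not pull the fitted line, which is the robustness motivation behind L1-norm subspace fitting.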

### Random selection of factors approximately preserves correlation structure in a linear factor model

- Computer Science, Mathematics
- 2017

A statistical factor model is developed, the random factor model, in which factors are chosen at random based on the random projection method, which enables derivation of probabilistic bounds for the accuracy of the random factor representation of time series, their cross-correlations, and covariances.

### Random selection of factors preserves the correlation structure in a linear factor model to a high degree

- Computer Science, Mathematics · PLoS ONE
- 2018

A statistical factor model is developed, the random factor model, in which factors are chosen stochastically based on the random projection method for derivation of probabilistic bounds for the accuracy of the random factor representation of time series, their cross-correlations, and covariances.

### Principal component analysis and singular value decomposition used for a numerical sensitivity analysis of a complex drawn part

- Materials Science · The International Journal of Advanced Manufacturing Technology
- 2017

The numerical forecasting of car body construction processes is already being used in industry to provide support in the ramp-up process. However, long calculation times are stretching the finite…

### Principal component analysis and singular value decomposition used for a numerical sensitivity analysis of a complex drawn part

- Materials Science
- 2018

The numerical forecasting of car body construction processes is already being used in industry to provide support in the ramp-up process. However, long calculation times are stretching the finite…

### Approximating L1-Norm Best-Fit Lines

- Mathematics, Computer Science
- 2019

Sufficient conditions are provided for a deterministic algorithm for estimating an L1-norm best-fit one-dimensional subspace, and an equivalence is established between this algorithm, which involves the calculation of several weighted medians, and independently derived algorithms based on finding L1-norm solutions to overdetermined systems of linear equations.

## References

Showing 1–10 of 34 references.

### K-means clustering via principal component analysis

- Computer Science · ICML
- 2004

It is proved that principal components are the continuous solutions to the discrete cluster membership indicators for K-means clustering, which indicates that unsupervised dimension reduction is closely related to unsupervised learning.
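A minimal numerical illustration of this relaxation (assuming well-separated clusters; not the paper's code): for two clusters, the sign of the first principal component score acts as a continuous surrogate for the 2-means cluster indicator:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated clusters of 40 points each in R^2.
X = np.vstack([rng.normal(0, 0.3, (40, 2)),
               rng.normal(4, 0.3, (40, 2))])

# First principal component of the centered data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[0]            # projection onto the leading PC

# Thresholding the PC score at zero recovers the cluster split.
labels = (scores > 0).astype(int)
```

With clusters this far apart, the sign split coincides exactly with the true 2-means partition; with overlapping clusters the correspondence is only approximate, which is why the result is a relaxation.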

### A Pure L1-norm Principal Component Analysis.

- Computer Science · Computational Statistics & Data Analysis
- 2013

Tests show that L1-PCA* is the indicated procedure in the presence of unbalanced outlier contamination, and the application of this idea, which fits data to subspaces of successively smaller dimension, is presented.

### Principal Component Analysis

- Mathematics, Geology · International Encyclopedia of Statistical Science
- 1986

Introduction * Properties of Population Principal Components * Properties of Sample Principal Components * Interpreting Principal Components: Examples * Graphical Representation of Data Using…

### Robust Principal Component Analysis with Non-Greedy l1-Norm Maximization

- Computer Science · IJCAI
- 2011

A robust principal component analysis with non-greedy ℓ1-norm maximization is proposed; experimental results on real-world datasets show that the non-greedy method consistently obtains much better solutions than the greedy method.

### Robust principal component analysis?

- Computer Science · J. ACM
- 2011

It is proved that, under suitable assumptions, both the low-rank and the sparse components can be recovered exactly by solving a very convenient convex program called Principal Component Pursuit, which, among all feasible decompositions, minimizes a weighted combination of the nuclear norm and the ℓ1 norm; this suggests the possibility of a principled approach to robust principal component analysis.
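A short ADMM-style sketch of Principal Component Pursuit (a common solver pattern, not the authors' reference implementation; the `mu` heuristic is one conventional choice and the function name is illustrative):

```python
import numpy as np

def pcp(M, lam=None, mu=None, n_iter=300):
    """Principal Component Pursuit sketch: split M into low-rank L plus
    sparse S by alternating singular-value thresholding (nuclear norm)
    and entrywise soft thresholding (l1 norm), with a dual update."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / np.abs(M).sum()
    soft = lambda A, t: np.sign(A) * np.maximum(np.abs(A) - t, 0.0)
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        # singular-value thresholding step for the low-rank part
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # soft-thresholding step for the sparse part
        S = soft(M - L + Y / mu, lam / mu)
        # dual ascent on the constraint M = L + S
        Y = Y + mu * (M - L - S)
    return L, S
```

The weight `lam = 1/sqrt(max(m, n))` is the universal choice analyzed in the paper; in this sketch `mu` is held fixed, whereas practical implementations often increase it across iterations for speed.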

### A Generalized Least-Square Matrix Decomposition

- Computer Science
- 2014

By finding the best low-rank approximation of the data with respect to a transposable quadratic norm, the generalized least-square matrix decomposition (GMD) directly accounts for structural relationships; it is demonstrated for dimension reduction, signal recovery, and feature selection with high-dimensional structured data.

### Principal Component Analysis Based on L1-Norm Maximization

- Computer Science · IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2008

A method of principal component analysis (PCA) based on a new L1-norm optimization technique is proposed; it is robust to outliers, invariant to rotations, and proven to find a locally maximal solution.
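A hedged sketch of the fixed-point update commonly used for this kind of L1-norm maximization PCA (the function name and stopping rule below are illustrative, not the paper's exact algorithm): maximize ||Xw||₁ over unit vectors w by iterating w ← normalize(Xᵀ sign(Xw)).

```python
import numpy as np

def pca_l1(X, n_iter=100, seed=0):
    """Fixed-point iteration for L1-norm maximization PCA:
    maximize ||X w||_1 over unit vectors w via the update
    w <- normalize( X^T sign(X w) ), which converges to a local max."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                 # center the data
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        s = np.sign(Xc @ w)
        s[s == 0] = 1.0                     # avoid zero signs stalling the update
        w_new = Xc.T @ s
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):           # fixed point reached
            break
        w = w_new
    return w
```

Each iteration can only increase the objective ||Xw||₁, which is why only a locally maximal solution is guaranteed; multiple random restarts are the usual remedy.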

### Spectral Relaxation for K-means Clustering

- Computer Science, Mathematics · NIPS
- 2001

It is shown that a relaxed version of the trace maximization problem possesses global optimal solutions which can be obtained by computing a partial eigendecomposition of the Gram matrix, and that the cluster assignment for each data vector can be found by computing a pivoted QR decomposition of the eigenvector matrix.
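A quick numerical check of that relaxation (illustrative, not the paper's code): the relaxed trace maximization over orthonormal n×k matrices Q is solved by the top-k eigenvectors of the Gram matrix, and the optimal value equals the sum of its top-k eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated blobs of 30 points each in R^2.
X = np.vstack([rng.normal(0, 0.2, (30, 2)),
               rng.normal(3, 0.2, (30, 2))])
K = X @ X.T                      # Gram matrix of the data

# Relaxed k-means: maximize trace(Q^T K Q) over orthonormal n x k Q.
k = 2
evals, evecs = np.linalg.eigh(K)  # eigenvalues in ascending order
Q = evecs[:, -k:]                 # top-k eigenvectors solve the relaxation
relaxed_obj = np.trace(Q.T @ K @ Q)
```

The relaxed optimum (the sum of the top-k eigenvalues) upper-bounds the discrete k-means trace objective, which is what makes the partial eigendecomposition useful as a starting point for cluster assignment.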

### Principal Component Analysis

- Environmental Science · Encyclopedia of Database Systems
- 2009

The Karhunen–Loève basis functions, more frequently referred to as principal components or empirical orthogonal functions (EOFs), of the noise response of the climate system are an important tool…