Principal component analysis

@article{abdi2010principal,
  title={Principal component analysis},
  author={Herv{\'e} Abdi and Lynne J. Williams},
  journal={Wiley Interdisciplinary Reviews: Computational Statistics},
  year={2010}
}
  • H. Abdi, L. Williams
  • Published 1 July 2010
  • Mathematics
  • Wiley Interdisciplinary Reviews: Computational Statistics
Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter‐correlated quantitative dependent variables. Its goal is to extract the important information from the table, to represent it as a set of new orthogonal variables called principal components, and to display the pattern of similarity of the observations and of the variables as points in maps. The quality of the PCA model can be evaluated using cross… 
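The description above — orthogonal components, with observations and variables displayed as points on maps — can be sketched with a centered SVD. A minimal numpy illustration on synthetic data (the data table, its shapes, and the seed are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data table: 20 observations described by 4 inter-correlated variables.
X = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(20, 4))

# Center the columns, then take the SVD; the principal components (factor
# scores) are U * S, and V holds the loadings of the original variables.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S    # observations as points on the component maps
loadings = Vt.T   # variables as points
```

Because the columns of U are orthonormal, the extracted components are orthogonal by construction, and `scores @ loadings.T` reproduces the centered table exactly.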
Introduction to principal components analysis.
  • Kristin L. Sainani
  • Computer Science
    PM&R: The Journal of Injury, Function, and Rehabilitation
  • 2014
Principal Component Analysis
This chapter shows how PCA arises naturally as the maximum likelihood solution to a particular form of a linear-Gaussian latent variable model, which is called probabilistic principal component analysis.
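The maximum likelihood solution of that linear-Gaussian model has a closed form (Tipping & Bishop): the noise variance is the mean of the discarded eigenvalues of the sample covariance, and the weight matrix is built from the leading eigenvectors. A hedged numpy sketch on synthetic data (dimensions and seed are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

d, q = X.shape[1], 2
S = np.cov(X, rowvar=False)                  # sample covariance of the data
evals, evecs = np.linalg.eigh(S)             # ascending eigenvalues
evals, evecs = evals[::-1], evecs[:, ::-1]   # reorder to descending

# ML noise variance: average of the discarded (smallest d-q) eigenvalues.
sigma2 = evals[q:].mean()
# ML weight matrix (up to rotation): W = U_q (Lambda_q - sigma^2 I)^(1/2).
W = evecs[:, :q] * np.sqrt(evals[:q] - sigma2)

# Implied model covariance: matches S exactly on the top-q subspace.
C = W @ W.T + sigma2 * np.eye(d)
```

A quick check of the structure: the top q eigenvalues of C equal those of S, and the remaining ones collapse to the single noise variance.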
On the dependency between principal components: Application to determine the rank of a matrix in an evolutionary process
A new method is proposed for exploring the dependency between the principal components of an evolutionary process; in most situations, MIC gave more accurate estimates of chemical rank, within a reasonable timescale, than DC and previously published rank-estimation methods.
Application of Multi-Dimensional Principal Component Analysis to Medical Data
The multi-dimensional PCA is applied to multi-dimensional medical data, including the functional independence measure (FIM) score, and the results of the experimental analysis are described.
Principal Component Analysis and Quasar Identification Techniques
This paper focuses on the PCA of a correlation matrix in order to extract the emission line ratios most relevant to the classification.
Partial least squares regression
Partial least squares (PLS) regression is a recent technique that combines features from, and generalizes, principal component analysis (PCA) and multiple linear regression; it extracts from the predictors a set of orthogonal factors, called latent variables, that have the best predictive power.
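The extraction of orthogonal latent variables can be sketched with the classic NIPALS-style PLS1 iteration (one response, deflation after each component). A minimal illustration, assuming numpy and invented synthetic data — not the paper's own implementation:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=30)

def pls1(X, y, n_comp):
    """Minimal PLS1 sketch: extract orthogonal score vectors predictive of y."""
    Xk, yk = X - X.mean(0), y - y.mean()
    T = []
    for _ in range(n_comp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)        # weight: direction of max covariance with y
        t = Xk @ w                    # latent variable (score vector)
        p = Xk.T @ t / (t @ t)        # X loading for this component
        Xk = Xk - np.outer(t, p)      # deflate X so the next score is orthogonal
        yk = yk - t * (yk @ t) / (t @ t)
        T.append(t)
    return np.column_stack(T)

T = pls1(X, y, 3)
```

The deflation step is what makes the successive score vectors mutually orthogonal, which is the property the abstract emphasizes.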
A Study of Effectiveness of Principal Component Analysis on Different Data Sets
This paper has taken 24 benchmark data sets from the University of California, Irvine (UCI) Machine Learning Repository and KEEL data set repository and shown how much information is retained by individual PC to show the effectiveness of PCA.
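The "information retained by each PC" in such studies is the proportion of total variance carried by each component, read off the eigenvalue (squared singular value) spectrum. A small numpy sketch on synthetic data (the data and seed are invented; this is not run on the UCI/KEEL sets):

```python
import numpy as np

rng = np.random.default_rng(3)
# Columns with deliberately unequal scales, so the spectrum is uneven.
X = rng.normal(size=(50, 4)) * np.array([3.0, 2.0, 1.0, 0.5])

# Proportion of total variance retained by each principal component.
Xc = X - X.mean(axis=0)
evals = np.linalg.svd(Xc, compute_uv=False) ** 2
retained = evals / evals.sum()
```

The `retained` vector sums to one and is sorted in decreasing order, so cumulative sums give the variance kept by the first k components.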
A Principal Component Analysis Algorithm Based on Dimension Reduction Window
A novel algorithm named DRWPCA is developed, inspired by the correlation coefficient (a numerical feature of a random variable) and by the sliding-window model used for traffic control in network engineering; it provides promising accuracy and a greater ability to reduce dimensionality, while preserving the original information in the data.
MFAg: an R package for carrying out multiple factor analysis
When studying groups of variables with a multivariate approach, the usual techniques are either limited or unviable for describing how distinct these groups are. The multiple factor…
Principals about principal components in statistical genetics
The possibilities, limitations and role of PCs in ancestry prediction, genome-wide association studies, rare variants analyses, imputation strategies, meta-analysis and epistasis detection are focused on.


Principal Component Analysis: Application to Statistical Process Control
PCA can also be used as a multivariate outlier detection method, especially by studying the last principal components, which is useful in multidimensional quality control.
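The intuition: an observation that violates the correlation structure projects strongly onto the last (smallest-variance) components even when each of its variables looks ordinary on its own. A hedged numpy sketch on synthetic data (the plane, noise level, and seed are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(4)
# Highly correlated data: observations lie close to a 2-D plane in 3-D space.
Z = rng.normal(size=(100, 2))
A = np.array([[1.0, 0.5, 0.3], [0.2, 1.0, 0.4]])
X = Z @ A + 0.01 * rng.normal(size=(100, 3))
# Push one observation off the plane, along its normal vector.
X[0] += 1.5 * np.cross(A[0], A[1])

# Scores on the last principal component flag structure-breaking points.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
last_score = np.abs(Xc @ Vt[-1])
outlier = int(np.argmax(last_score))
```

Here the displaced observation dominates the last-component scores, while a check of each variable separately would not reveal it.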
Partial least squares regression and projection on latent structure regression (PLS Regression)
Partial least squares (PLS) regression (a.k.a. projection on latent structures) is a recent technique that combines features from and generalizes principal component analysis (PCA) and multiple linear regression.
Cross-Validatory Estimation of the Number of Components in Factor and Principal Components Models
By means of factor analysis (FA) or principal components analysis (PCA), a matrix Y with elements y_ik is approximated by a model in which the parameters α, β and θ express the systematic part of …
Multiple Factor Analysis (MFA)
Multiple factor analysis (MFA) analyzes observations described by several “blocks” or sets of variables to seek the common structures present in all or some of these sets.
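One common recipe for MFA (as described in the Abdi et al. literature): weight each centered block by the inverse of its first singular value, so that no block dominates, then run an ordinary PCA on the concatenated table. A minimal numpy sketch with invented block sizes and seed:

```python
import numpy as np

rng = np.random.default_rng(5)
# Two "blocks" of variables describing the same 15 observations.
blocks = [rng.normal(size=(15, 3)), rng.normal(size=(15, 5))]

# MFA weighting: divide each centered block by its first singular value,
# then perform a global PCA on the concatenated, reweighted table.
weighted = []
for B in blocks:
    Bc = B - B.mean(axis=0)
    weighted.append(Bc / np.linalg.svd(Bc, compute_uv=False)[0])
Z = np.hstack(weighted)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
global_scores = U * S
```

After the reweighting, every block's first singular value is 1, which is exactly what equalizes the blocks' contributions to the global analysis.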
Cross-Validatory Choice of the Number of Components From a Principal Component Analysis
The method is based on successively predicting each element in the data matrix after deleting the corresponding row and column of the matrix, and makes use of recently published algorithms for updating a singular value decomposition.
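A simplified sketch of that idea in numpy: predict each element y_ik from a rank-r model fitted with row i deleted (giving loadings) and column k deleted (giving scores), and compare the resulting prediction error (PRESS) across ranks. This is only an illustration — the SVD sign ambiguity is resolved here against the full-data decomposition rather than by the updating algorithms the paper uses, and the data and seed are invented:

```python
import numpy as np

rng = np.random.default_rng(6)
# Nearly rank-2 matrix: 12 observations, 6 variables, small noise.
Y = rng.normal(size=(12, 2)) @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(12, 6))

U0, _, V0t = np.linalg.svd(Y, full_matrices=False)  # reference for sign alignment

def press(Y, r):
    """PRESS for a rank-r model: predict y_ik after deleting row i and column k."""
    n, p = Y.shape
    total = 0.0
    for i in range(n):
        # SVD with row i deleted gives loadings (Vt) and singular values d_row.
        _, d_row, Vt = np.linalg.svd(np.delete(Y, i, axis=0), full_matrices=False)
        Vt = Vt * np.sign(np.sum(Vt * V0t, axis=1))[:, None]  # fix axis signs
        for k in range(p):
            # SVD with column k deleted gives this row's scores (U) and d_col.
            U, d_col, _ = np.linalg.svd(np.delete(Y, k, axis=1), full_matrices=False)
            U = U * np.sign(np.sum(U * U0[:, :U.shape[1]], axis=0))
            pred = sum(U[i, a] * np.sqrt(d_col[a] * d_row[a]) * Vt[a, k]
                       for a in range(r))
            total += (Y[i, k] - pred) ** 2
    return total
```

On this nearly rank-2 matrix, PRESS drops sharply from one to two components, which is how the cross-validatory choice of dimensionality is made.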
The Eigen-Decomposition : Eigenvalues and Eigenvectors
  • H. Abdi
  • Mathematics, Computer Science
  • 2006
Eigenvectors and eigenvalues are the vectors and numbers associated with square matrices; together they provide the eigen-decomposition of a matrix, which analyzes the structure of this matrix. Even…
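In numpy terms, the eigen-decomposition factors a square matrix as A = V diag(w) V⁻¹, where the columns of V are the eigenvectors and w the eigenvalues. A minimal example on a small invented matrix:

```python
import numpy as np

# A square (here symmetric) matrix and its eigen-decomposition.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, V = np.linalg.eig(A)

# Reconstruct A from its eigenvalues and eigenvectors: A = V diag(w) V^-1.
A_rebuilt = V @ np.diag(w) @ np.linalg.inv(V)
```

Each column of V satisfies the defining relation A v = λ v for its eigenvalue, which is what makes the decomposition reveal the matrix's structure.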
Approaches to determining the number of components to interpret from principal components analysis were compared. Heuristic procedures included: retaining components with eigenvalues (λs) > 1 (i.e., …
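The eigenvalue-greater-than-one heuristic (Kaiser's rule) works on the correlation matrix, where each standardized variable contributes unit variance, so a component with λ > 1 explains more than any single variable. A numpy sketch on synthetic data (correlated columns, invented seed):

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 6))
X[:, 3] = X[:, 0] + 0.1 * rng.normal(size=100)  # make two pairs of columns
X[:, 4] = X[:, 1] + 0.1 * rng.normal(size=100)  # strongly correlated

# Kaiser's rule: keep components of the correlation matrix with eigenvalue > 1.
R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
n_keep = int(np.sum(eigvals > 1.0))
```

Since the correlation matrix has unit diagonal, the eigenvalues always sum to the number of variables, which is what makes 1 a natural threshold.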
Bootstrapping Principal Components Analysis: Reply to Mehlman Et Al.
Mehlman et al. (1995) identify a condition that may arise in various multivariate procedures, i.e., the reflection or reversal of the axis direction. They suggest that this condition may have led to …
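The reflection issue arises because an SVD determines each axis only up to sign, so bootstrap replicates must be sign-aligned to a reference before loadings are averaged or their variability summarized. A hedged numpy sketch (synthetic data, invented seed and replicate count):

```python
import numpy as np

rng = np.random.default_rng(8)
X = rng.normal(size=(80, 2)) @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(80, 4))
Xc = X - X.mean(axis=0)

# Reference loadings from the full sample.
_, _, Vt_ref = np.linalg.svd(Xc, full_matrices=False)

boot_loadings = []
for _ in range(200):
    idx = rng.integers(0, len(X), len(X))        # resample observations
    Xb = X[idx] - X[idx].mean(axis=0)
    _, _, Vt = np.linalg.svd(Xb, full_matrices=False)
    # Axis reflection fix: flip each bootstrap axis to agree with the
    # reference loadings before accumulating.
    signs = np.sign(np.sum(Vt * Vt_ref, axis=1))
    boot_loadings.append(Vt * signs[:, None])

se = np.std(boot_loadings, axis=0)  # bootstrap standard errors of the loadings
```

Without the sign flip, roughly half the replicates would carry reversed axes and the standard errors would be badly inflated — exactly the condition Mehlman et al. describe.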
Multiple Correspondence Analysis
Multiple correspondence analysis (MCA) is an extension of correspondence analysis (CA) which allows one to analyze the pattern of relationships of several categorical dependent variables. As such, it …
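Operationally, MCA amounts to running CA on the indicator (one-hot) matrix of the categorical variables: form the correspondence matrix, remove the independence model, standardize by the margins, and take an SVD. A minimal numpy sketch with an invented toy table of three categorical variables on eight individuals:

```python
import numpy as np

# Three categorical variables observed on 8 individuals (coded as level labels).
data = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 1],
                 [2, 0, 0],
                 [1, 1, 1],
                 [2, 1, 0],
                 [0, 0, 1],
                 [1, 0, 0]])

# Build the indicator (one-hot) matrix, one block of columns per variable.
blocks = [np.eye(data[:, j].max() + 1)[data[:, j]] for j in range(data.shape[1])]
Z = np.hstack(blocks)

# Correspondence analysis of Z: SVD of the standardized residuals from
# the independence model.
P = Z / Z.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, d, Vt = np.linalg.svd(S, full_matrices=False)
row_scores = (U * d) / np.sqrt(r)[:, None]  # principal coordinates of individuals
```

Each row of Z sums to the number of variables, and subtracting the rank-one independence term removes the trivial dimension that would otherwise dominate the analysis.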