Factor analysis and AIC
@article{Akaike1987FactorAA, title={Factor analysis and AIC}, author={Hirotugu Akaike}, journal={Psychometrika}, year={1987}, volume={52}, pages={317-332} }
The information criterion AIC was introduced to extend the method of maximum likelihood to the multimodel situation. It was obtained by relating the successful experience of the order determination of an autoregressive model to the determination of the number of factors in the maximum likelihood factor analysis. The use of the AIC criterion in the factor analysis is particularly interesting when it is viewed as the choice of a Bayesian model. This observation shows that the area of application…
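The paper's central idea is to pick the number of factors that minimizes AIC = -2 log L + 2k over candidate models. A minimal sketch of that procedure is below; it uses scikit-learn's `FactorAnalysis` and the standard free-parameter count p·m + p − m(m−1)/2, both of which are assumptions for illustration rather than anything from the 1987 paper itself.

```python
# Minimal sketch: choose the number of factors by minimizing AIC,
# in the spirit of Akaike (1987). scikit-learn's FactorAnalysis is
# used here purely for illustration (assumption, not the paper's code).
import numpy as np
from sklearn.decomposition import FactorAnalysis

def factor_aic(X, m):
    """AIC for a maximum likelihood factor model with m factors."""
    n, p = X.shape
    fa = FactorAnalysis(n_components=m).fit(X)
    loglik = fa.score(X) * n              # score() is the mean log-likelihood per sample
    # Free parameters: p*m loadings + p unique variances,
    # minus m(m-1)/2 for rotational indeterminacy.
    k = p * m + p - m * (m - 1) // 2
    return -2.0 * loglik + 2.0 * k

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 8)) @ rng.standard_normal((8, 8))  # toy correlated data
aics = {m: factor_aic(X, m) for m in range(1, 5)}
best_m = min(aics, key=aics.get)
print(aics, "-> chosen number of factors:", best_m)
```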
1,482 Citations
Application of AIC to Wald and Lagrange Multiplier Tests in Covariance Structure Analysis.
- Mathematics · Multivariate Behavioral Research
- 1996
Some efficient procedures for the use of AIC in covariance structure analysis are proposed, based on the backward search via the Wald test to impose constraints and the forward search via the Lagrange Multiplier test to release constraints.
Bayesian Information Criterion and Selection of the Number of Factors in Factor Analysis Models
- Mathematics
- 2021
In maximum likelihood exploratory factor analysis, the estimates of unique variances can often turn out to be zero or negative, which makes no sense from a statistical point of view. In order to…
Efficient Bayesian Model Averaging in Factor Analysis
- Computer Science
- 2006
An efficient Bayesian approach for model selection and averaging in hierarchical models having one or more factor analytic components is proposed, which results in a highly efficient stochastic search factor selection algorithm (SSFS) for identifying good factor models and performing model-averaged inferences.
AIC model selection using Akaike weights
- Computer Science · Psychonomic Bulletin & Review
- 2004
It is demonstrated that AIC values can be easily transformed to so-called Akaike weights, which can be directly interpreted as conditional probabilities for each model.
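The transformation this entry refers to is w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2), where Δ_i = AIC_i − min_j AIC_j. A short sketch of that computation follows; the example AIC values are made up for illustration.

```python
# Minimal sketch of converting AIC values to Akaike weights:
# w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
# with delta_i = AIC_i - min_j AIC_j.
import numpy as np

def akaike_weights(aic_values):
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()          # AIC differences relative to the best model
    rel_lik = np.exp(-0.5 * delta)   # relative likelihood of each model
    return rel_lik / rel_lik.sum()   # normalize to conditional model probabilities

print(akaike_weights([204.1, 206.3, 210.8]))  # roughly [0.73, 0.24, 0.03]
```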
Choosing an appropriate number of factors in factor analysis with incomplete data
- Computer Science · Comput. Stat. Data Anal.
- 2008
Estimation of an oblique structure via penalized likelihood factor analysis
- Computer Science · Comput. Stat. Data Anal.
- 2014
A Comparative Investigation on Model Selection in Independent Factor Analysis
- Computer Science · J. Math. Model. Algorithms
- 2006
This paper further investigates BYY harmony learning in comparison with existing typical criteria, including Akaike's information criterion (AIC), the consistent Akaike's information criterion (CAIC), the Bayesian information criterion (BIC), and the cross-validation (CV) criterion, on selection of the number of factors.
A Comparison of Ten Methods for Determining the Number of Factors in Exploratory Factor Analysis
- Psychology
- 2013
The effectiveness of 10 methods for estimating the number of factors was compared. These methods were the minimum average partial procedure, the likelihood ratio test, the Akaike…
A family of the information criteria using the phi-divergence for categorical data
- Computer Science · Comput. Stat. Data Anal.
- 2018
SAS Code to Select the Best Multiple Linear Regression Model for Multivariate Data Using Information Criteria
- Computer Science
- 2005
Multiple linear regression is a standard statistical tool that regresses p independent variables against a single dependent variable. The objective is to find a linear model that best predicts the…
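That entry describes SAS code; a rough Python analogue of the same idea, exhaustively fitting predictor subsets and keeping the one with the lowest AIC, is sketched below. The use of statsmodels and the toy data are assumptions for illustration, not the paper's actual procedure.

```python
# Rough sketch (not the paper's SAS code): select the best multiple linear
# regression model by an information criterion, here AIC, by fitting every
# subset of candidate predictors.
from itertools import combinations
import numpy as np
import statsmodels.api as sm

def best_subset_by_aic(y, X, names):
    best = (np.inf, None)
    for r in range(1, len(names) + 1):
        for subset in combinations(range(len(names)), r):
            design = sm.add_constant(X[:, list(subset)])  # intercept + chosen predictors
            aic = sm.OLS(y, design).fit().aic
            if aic < best[0]:
                best = (aic, [names[i] for i in subset])
    return best

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.standard_normal(200)  # only x0 and x2 matter
print(best_subset_by_aic(y, X, ["x0", "x1", "x2", "x3"]))
```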