Factor analysis and AIC

@article{Akaike1987FactorAA,
  title={Factor analysis and AIC},
  author={Hirotugu Akaike},
  journal={Psychometrika},
  year={1987},
  volume={52},
  pages={317--332}
}
The information criterion AIC was introduced to extend the method of maximum likelihood to the multimodel situation. It was obtained by relating the successful experience of order determination for autoregressive models to the determination of the number of factors in maximum likelihood factor analysis. The use of the AIC criterion in factor analysis is particularly interesting when it is viewed as the choice of a Bayesian model. This observation shows that the area of application…
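
As a concrete illustration of the idea in the abstract, the following is a minimal sketch (not code from the paper) of choosing the number of factors by AIC, assuming scikit-learn's FactorAnalysis for the maximum likelihood fit and the usual free-parameter count p(m+1) - m(m-1)/2, which corrects for the rotational indeterminacy of the loading matrix.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import FactorAnalysis

X = load_iris().data            # stand-in data set (n=150, p=4)
n, p = X.shape

best_m, best_aic = None, np.inf
for m in range(1, p):           # candidate numbers of factors
    fa = FactorAnalysis(n_components=m).fit(X)
    loglik = n * fa.score(X)    # score() returns the mean log-likelihood per sample
    k = p * (m + 1) - m * (m - 1) // 2   # free parameters after the rotation correction
    aic = -2.0 * loglik + 2.0 * k
    print(f"m={m}: AIC={aic:.1f}")
    if aic < best_aic:
        best_m, best_aic = m, aic

print("AIC-selected number of factors:", best_m)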
Application of AIC to Wald and Lagrange Multiplier Tests in Covariance Structure Analysis.
TLDR
Some efficient procedures for the use of AIC in covariance structure analysis are proposed, based on a backward search via the Wald test to impose constraints and a forward search via the Lagrange Multiplier test to release constraints.
Bayesian Information Criterion and Selection of the Number of Factors in Factor Analysis Models
In maximum likelihood exploratory factor analysis, the estimates of unique variances can often turn out to be zero or negative, which makes no sense from a statistical point of view. In order to…
Efficient Bayesian Model Averaging in Factor Analysis
TLDR
An efficient Bayesian approach for model selection and averaging in hierarchical models having one or more factor analytic components is proposed, which results in a highly efficient stochastic search factor selection algorithm (SSFS) for identifying good factor models and performing model-averaged inferences.
AIC model selection using Akaike weights
TLDR
It is demonstrated that AIC values can be easily transformed to so-called Akaike weights, which can be directly interpreted as conditional probabilities for each model.
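
The transformation in question is simple enough to show directly. Below is a small sketch assuming the standard definition: with delta_i = AIC_i - min(AIC), the weight of model i is w_i = exp(-delta_i/2) normalized so the weights sum to one.

import numpy as np

def akaike_weights(aic_values):
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()        # differences from the best (lowest-AIC) model
    rel = np.exp(-0.5 * delta)     # relative likelihoods of the candidate models
    return rel / rel.sum()         # Akaike weights, summing to 1

print(akaike_weights([100.0, 102.0, 110.0]))  # roughly [0.73, 0.27, 0.005]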
Choosing an appropriate number of factors in factor analysis with incomplete data
Estimation of an oblique structure via penalized likelihood factor analysis
A Comparative Investigation on Model Selection in Independent Factor Analysis
TLDR
This paper further investigates BYY harmony learning in comparison with typical existing criteria, including Akaike's information criterion (AIC), the consistent Akaike information criterion (CAIC), the Bayesian information criterion (BIC), and the cross-validation (CV) criterion, for selecting the number of factors.
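
For reference, the criteria named in this comparison differ only in their penalty terms. The sketch below assumes the common definitions, with log-likelihood L, k free parameters, and sample size n.

import math

def criteria(loglik, k, n):
    # AIC penalizes each parameter by 2; BIC by ln(n); CAIC by ln(n) + 1.
    return {
        "AIC":  -2 * loglik + 2 * k,
        "BIC":  -2 * loglik + k * math.log(n),
        "CAIC": -2 * loglik + k * (math.log(n) + 1),
    }

print(criteria(loglik=-512.3, k=13, n=150))  # illustrative numbers only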
A Comparison of Ten Methods for Determining the Number of Factors in Exploratory Factor Analysis
The effectiveness of 10 methods for estimating the number of factors was compared. These methods were the minimum average partial procedure, the likelihood ratio test, the Akaike…
SAS Code to Select the Best Multiple Linear Regression Model for Multivariate Data Using Information Criteria
Multiple linear regression is a standard statistical tool that regresses p independent variables against a single dependent variable. The objective is to find a linear model that best predicts the…
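
Since this entry describes SAS code, the following is a hedged Python analog (not the paper's code) of the same task: an exhaustive best-subset search over predictors, scoring each linear model with the Gaussian form AIC = n*ln(RSS/n) + 2k, where k counts the intercept plus the included predictors.

import itertools
import numpy as np

def best_subset_by_aic(X, y):
    n, p = X.shape
    best = (np.inf, ())
    for r in range(1, p + 1):
        for subset in itertools.combinations(range(p), r):
            Xs = np.column_stack([np.ones(n), X[:, list(subset)]])
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)  # OLS fit
            rss = np.sum((y - Xs @ beta) ** 2)
            aic = n * np.log(rss / n) + 2 * (len(subset) + 1)
            if aic < best[0]:
                best = (aic, subset)
    return best

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(size=100)
aic, subset = best_subset_by_aic(X, y)
print(f"best subset {subset}, AIC={aic:.1f}")  # typically selects (0, 2)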