Factor analysis and AIC

@article{Akaike1987FactorAA,
  title={Factor analysis and AIC},
  author={Hirotugu Akaike},
  journal={Psychometrika},
  year={1987},
  volume={52},
  pages={317--332}
}
  • H. Akaike · Psychometrika · Published 1 September 1987
The information criterion AIC was introduced to extend the method of maximum likelihood to the multimodel situation. It was obtained by relating the successful experience of order determination for autoregressive models to the determination of the number of factors in maximum likelihood factor analysis. The use of the AIC criterion in factor analysis is particularly interesting when it is viewed as the choice of a Bayesian model. This observation shows that the area of application… 
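The criterion sketched in the abstract can be made concrete. The snippet below is a minimal illustration, not code from the paper: `factor_model_params` uses the standard free-parameter count for an m-factor model of p observed variables (p·m loadings plus p uniquenesses, minus m(m−1)/2 rotational constraints), and `aic` is Akaike's −2 log L + 2k. Both function names are my own.

```python
def factor_model_params(p: int, m: int) -> int:
    # Free parameters in an m-factor model of p observed variables:
    # p*m loadings + p uniquenesses, minus m(m-1)/2 rotational constraints.
    return p * m + p - m * (m - 1) // 2

def aic(log_likelihood: float, n_params: int) -> float:
    # Akaike's criterion: -2 log L + 2k; smaller is better.
    return -2.0 * log_likelihood + 2.0 * n_params
```

To choose the number of factors, one fits the model for each candidate m and keeps the m with the smallest `aic` value.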
EMPIRICAL ASSESSMENTS OF AIC PROCEDURE FOR MODEL SELECTION IN FACTOR ANALYSIS
Application of AIC to the number of factors problem in maximum likelihood factor analysis was investigated. Analysis of some empirical data sets suggested that nonconvergent cases and improper
Bayesian Selection on the Number of Factors in a Factor Analysis Model
This paper considers a Bayesian approach for selecting the number of factors in a factor analysis model with continuous and polytomous variables. A procedure for computing the important statistic in
Bayesian Inference in Factor Analysis
We propose a new method for analyzing factor analysis models using a Bayesian approach. Normal theory is used for the sampling distribution, and we adopt a model with a full disturbance covariance
Model selection and Akaike's Information Criterion (AIC): The general theory and its analytical extensions
During the last fifteen years, Akaike's entropy-based Information Criterion (AIC) has had a fundamental impact in statistical model evaluation problems. This paper studies the general theory of the
Parsimonious Bayesian Factor Analysis when the Number of Factors is Unknown
This work introduces a new and general set of identifiability conditions for factor models which handles the ordering problem associated with current common practice and leads to a factor loading matrix representation which is an intuitive and easy to implement factor selection scheme.
AIC model selection using Akaike weights
It is demonstrated that AIC values can be easily transformed to so-called Akaike weights, which can be directly interpreted as conditional probabilities for each model.
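The transformation described above is a short computation. A minimal sketch (the function name `akaike_weights` is mine): with Δᵢ = AICᵢ − min AIC, the weight of model i is wᵢ = exp(−Δᵢ/2) / Σⱼ exp(−Δⱼ/2).

```python
import math

def akaike_weights(aic_values):
    # Delta_i = AIC_i - min(AIC); w_i = exp(-Delta_i/2) / sum_j exp(-Delta_j/2).
    # Subtracting the minimum first keeps the exponentials numerically stable.
    best = min(aic_values)
    rel = [math.exp(-(a - best) / 2.0) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]
```

The weights sum to one, and the model with the smallest AIC receives the largest weight.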
On the information-based measure of covariance complexity and its application to the evaluation of multivariate linear models
This paper introduces a new information-theoretic measure of complexity called ICOMP as a decision rule for model selection and evaluation for multivariate linear models. The development of ICOMP is
A Bayesian Approach for Multigroup Nonlinear Factor Analysis
The main purpose of this article is to develop a Bayesian approach for a general multigroup nonlinear factor analysis model. Joint Bayesian estimates of the factor scores and the structural
AIC and Large Samples
I discuss the behavior of the Akaike Information Criterion in the limit when the sample size grows. I show the falsity of the claim made recently by Stanley Mulaik in Philosophy of Science that AIC
...

References

Estimation and tests of significance in factor analysis
A distinction is drawn between the method of principal components developed by Hotelling and the common factor analysis discussed in psychological literature both from the point of view of stochastic
A new look at the statistical model identification
The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly and it is pointed out that the hypothesis testing procedure is not adequately defined as
Posterior analysis of the factor model
It is argued here that, if all variables in the model are random, then Bayes' theorem provides the logical link between the data and the unobserved latent variables.
Some contributions to maximum likelihood factor analysis
A new computational method for the maximum likelihood solution in factor analysis is presented. This method takes into account the fact that the likelihood function may not have a maximum in a point
Structural analysis of covariance and correlation matrices
A general approach to the analysis of covariance structures is considered, in which the variances and covariances or correlations of the observed variables are directly expressed in terms of the
Likelihood and the Bayes procedure
Numerical examples, including seasonal adjustment of time series, are given to illustrate the practical utility of the common-sense approach to Bayesian statistics proposed in this paper.
An Expert Model Selection Approach to Determine the “Best” Pattern Structure in Factor Analysis Models
An expert data-analytic model selection approach to choose the number of factors and determine the “best” factor pattern structure among all possible patterns under the orthogonal factor model using Mallows’ Cp Criterion.
A Newton-Raphson algorithm for maximum likelihood factor analysis
This paper demonstrates the feasibility of using a Newton-Raphson algorithm to solve the likelihood equations which arise in maximum likelihood factor analysis. The algorithm leads to clean easily
Fitting autoregressive models for prediction
This is a preliminary report on a newly developed simple and practical procedure of statistical identification of predictors by using autoregressive models. The use of autoregressive representation
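The order-determination idea referenced here — fit AR(k) for a range of k and keep the order with the smallest criterion value — can be sketched as follows. This is an illustration, not the paper's procedure: it fits each AR(k) by ordinary least squares on a common effective sample and scores it with AIC ≈ n log σ̂² + 2k (Gaussian likelihood up to an additive constant); the function name is hypothetical.

```python
import numpy as np

def ar_order_by_aic(x, k_max):
    # Compare AR(k) fits for k = 1..k_max on the same effective sample
    # (the last len(x) - k_max points) so their AIC values are comparable.
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x) - k_max
    y = x[k_max:]
    aics = {}
    for k in range(1, k_max + 1):
        # Design matrix: column j holds the j-th lagged value for each target.
        X = np.column_stack([x[k_max - j : len(x) - j] for j in range(1, k + 1)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ coef) ** 2)
        aics[k] = n * np.log(sigma2) + 2 * k
    return min(aics, key=aics.get), aics
```

With a long enough series, the minimizing order tends to track the true order of the generating process, though AIC is known to overfit occasionally.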
Bayesian estimation in unrestricted factor analysis: A treatment for heywood cases
A Bayesian procedure is given for estimation in unrestricted common factor analysis. A choice of the form of the prior distribution is justified. It is shown empirically that the procedure achieves
...