Estimation of an oblique structure via penalized likelihood factor analysis

@article{Hirose2014EstimationOA,
  title={Estimation of an oblique structure via penalized likelihood factor analysis},
  author={Kei Hirose and Michio Yamamoto},
  journal={Computational Statistics \& Data Analysis},
  year={2014},
  volume={79},
  pages={120--132}
}
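As a reading aid, the generic form of a penalized maximum likelihood criterion for an oblique (correlated-factor) model can be sketched as follows; this is a standard formulation, and the paper's exact penalty and parameterization may differ:

```latex
\ell_{\rho}(\Lambda, \Psi, \Phi)
  = \ell(\Lambda, \Psi, \Phi)
    - N \sum_{i=1}^{p} \sum_{j=1}^{m} \rho\, P\!\left(|\lambda_{ij}|\right)
```

where \(\ell\) is the factor analysis log-likelihood, \(\Lambda = (\lambda_{ij})\) the loading matrix, \(\Psi\) the unique variances, \(\Phi\) the factor correlation matrix capturing the oblique structure, \(\rho\) a regularization parameter, and \(P\) a sparsity-inducing penalty (e.g., the lasso or MC+).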


A penalized likelihood-based framework for single and multiple-group factor analysis models
TLDR
A general penalized likelihood estimation approach for normal linear factor analysis models is proposed, based on differentiable approximations of non-differentiable penalties and a theoretically founded definition of degrees of freedom.
Simultaneous variable and factor selection via sparse group lasso in factor analysis
  • Y. Dang, Qing Wang
  • Computer Science
    Journal of Statistical Computation and Simulation
  • 2019
TLDR
Simulation results reveal that the proposed method can better identify the possibly sparse structure of the true factor loading matrix with higher estimation accuracy than existing methods.
Approximated Penalized Maximum Likelihood for Exploratory Factor Analysis: An Orthogonal Case
TLDR
An approximation to PML is proposed; it naturally produces a sparse loading matrix and estimates the factor loadings and the covariance matrix more accurately than factor rotation, in the sense of a lower mean squared error, under various conditions.
On Estimation of Sparse Factor Loadings Using Distribution-free Approach
TLDR
This work proposes a method for obtaining sparse factor loadings without requiring any distributional assumption; it outperforms penalized likelihood factor analysis via nonconvex penalties, yielding smaller MSE values even when the two methods attain the same level of sparsity.
Graphical tool of sparse factor analysis
TLDR
An overview of several sparse factor analysis models is given, followed by a discussion of the relation between ordinary factor rotation and penalized maximum likelihood approaches. A novel analysis tool is introduced with which a user can select a model that is easy to interpret and also possesses desirable goodness-of-fit values, based on a graphical representation of the solution path.
Decoupling Shrinkage and Selection in Gaussian Linear Factor Analysis
TLDR
This paper proposes a decision-theoretic approach that brings to light the relation between a sparse representation of the loadings and factor dimension through a summary from information contained in the multivariate posterior.
A Penalized Likelihood Method for Structural Equation Modeling
TLDR
A penalized likelihood (PL) method for structural equation modeling (SEM) was proposed as a methodology for exploring the underlying relations among both observed and latent variables and an expectation-conditional maximization algorithm was developed to maximize the PL criterion.
Sparse and Simple Structure Estimation via Prenet Penalization.
TLDR
The prenet (product-based elastic net), a novel penalization method for factor analysis models, is introduced; its estimate can attain a perfect simple structure, which is known as a desirable structure in terms of the simplicity of the loading matrix.
lslx: Semi-Confirmatory Structural Equation Modeling via Penalized Likelihood
TLDR
An R package called lslx is described that implements PL methods for semi-confirmatory structural equation modeling (SEM). It includes a quasi-Newton method and supports other advanced functionalities, including a two-stage method with auxiliary variables for handling missing data and a reparameterized multi-group SEM for exploring population heterogeneity.
A Correlation Thresholding Algorithm for Learning Factor Analysis Models
TLDR
This work proposes a fast algorithm that simultaneously learns the number of latent factors and a model structure leading to identifiable parameters; the correlation thresholding algorithm is shown to learn the structure of factor analysis models accurately and to be robust to violations of its assumptions.
...

References

Showing 1–10 of 39 references
Sparse estimation via nonconcave penalized likelihood in factor analysis model
TLDR
It is shown that the penalized likelihood procedure can be viewed as a generalization of the traditional two-step approach, and the proposed methodology can produce sparser solutions than the rotation technique.
A Penalized Maximum Likelihood Approach to Sparse Factor Analysis
TLDR
An ℓ1 penalization method is introduced for performing sparse factor analysis in which factor loadings naturally adopt a sparse representation, greatly facilitating the interpretation of the fitted factor model.
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
TLDR
In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well as if the correct submodel were known.
Sparse factor analysis via likelihood and ℓ1-regularization
  • L. Ning, T. Georgiou
  • Computer Science
    IEEE Conference on Decision and Control and European Control Conference
  • 2011
TLDR
An algorithm is proposed that weighs an ℓ1-regularization term, which induces sparsity of the linear (factor) model, against a likelihood term that quantifies the distance of the model to the sample covariance; the method compares favorably against standard factor analysis techniques.
Bayesian Information Criterion and Selection of the Number of Factors in Factor Analysis Models
In maximum likelihood exploratory factor analysis, the estimates of unique variances can often turn out to be zero or negative, which makes no sense from a statistical point of view. …
Regression Shrinkage and Selection via the Lasso
TLDR
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
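In symbols, the lasso estimate described above solves the following constrained least-squares problem (a standard statement of the criterion, not quoted from this page):

```latex
\hat{\beta}^{\text{lasso}}
  = \arg\min_{\beta}
    \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \Big)^{2}
  \quad \text{subject to} \quad
  \sum_{j=1}^{p} |\beta_j| \le t
```

where the bound \(t \ge 0\) controls the amount of shrinkage; small \(t\) forces some coefficients exactly to zero, which is what makes the lasso a variable selector.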
COORDINATE DESCENT ALGORITHMS FOR NONCONVEX PENALIZED REGRESSION, WITH APPLICATIONS TO BIOLOGICAL FEATURE SELECTION.
TLDR
The potential of coordinate descent algorithms for fitting penalized regression models is demonstrated: theoretical convergence properties are established, the algorithms are shown to be significantly faster than competing approaches, and numerical results suggest that MCP is the preferred approach among the three penalties considered.
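As a minimal sketch of the soft-thresholding update at the heart of coordinate descent for the lasso (with standardized predictors assumed; the function names and data are illustrative, not taken from the paper):

```python
import numpy as np

def soft_threshold(z, lam):
    # Soft-thresholding operator: closed-form solution of the scalar
    # problem min_b 0.5*(b - z)**2 + lam*|b|.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Cyclic coordinate descent for the lasso criterion
    #   min_beta (1/(2n)) * ||y - X beta||^2 + lam * ||beta||_1,
    # assuming the columns of X are centered and scaled to unit variance.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed from the fit.
            r = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r / n
            beta[j] = soft_threshold(z, lam)
    return beta
```

For a large enough `lam` every coefficient is thresholded exactly to zero, while small `lam` approaches the least-squares fit; nonconvex penalties such as MCP replace the soft-thresholding step with a different scalar update but keep the same coordinate-wise structure.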
On the “degrees of freedom” of the lasso
TLDR
The number of nonzero coefficients is shown to be an unbiased estimate of the degrees of freedom of the lasso, a conclusion that requires no special assumptions on the predictors; the unbiased estimator is also shown to be asymptotically consistent.
Some contributions to maximum likelihood factor analysis
A new computational method for the maximum likelihood solution in factor analysis is presented. This method takes into account the fact that the likelihood function may not have a maximum at a point …
On Model Selection Consistency of Lasso
TLDR
It is proved that a single condition, which is called the Irrepresentable Condition, is almost necessary and sufficient for Lasso to select the true model both in the classical fixed p setting and in the large p setting as the sample size n gets large.
...