A family of the information criteria using the phi-divergence for categorical data
@article{Ogasawara2018AFO,
  title   = {A family of the information criteria using the phi-divergence for categorical data},
  author  = {Haruhiko Ogasawara},
  journal = {Comput. Stat. Data Anal.},
  year    = {2018},
  volume  = {124},
  pages   = {87--103}
}
One Citation
Asymptotic cumulants of the minimum phi-divergence estimator for categorical data under possible model misspecification
- Mathematics · Communications in Statistics - Theory and Methods
- 2019
The asymptotic cumulants of the minimum phi-divergence estimators of the parameters in a model for categorical data are obtained up to the fourth order with the higher-order asymptotic…
References
Showing 1-10 of 58 references
Bias correction of the Akaike information criterion in factor analysis
- Mathematics · J. Multivar. Anal.
- 2016
Minimum power-divergence estimator in three-way contingency tables
- Mathematics
- 2003
Cressie et al. (2000, 2003) introduced and studied a new family of statistics, based on the φ-divergence measure, for solving the problem of testing a nested sequence of loglinear models. In that…
Size and power considerations for testing loglinear models using divergence test statistics
- Mathematics
- 2003
In this article, we assume that categorical data are distributed according to a multinomial distribution whose probabilities follow a loglinear model. The inference problem we consider is that of…
Minimum φ-Divergence Estimation in Constrained Latent Class Models for Binary Data
- Mathematics, Computer Science · Psychometrika
- 2015
The main purpose of this paper is to introduce and study the behavior of minimum φ-divergence estimators as an alternative to the maximum-likelihood estimator in latent class models for…
A class of cross-validatory model selection criteria
- Mathematics
- 2013
In this paper, we define a class of cross-validatory model selection criteria as an estimator of the predictive risk function based on a discrepancy between a candidate model and the true model. For…
Further analysis of the data by Akaike's information criterion and the finite corrections
- Mathematics
- 1978
Using Akaike's information criterion, three examples of statistical data are reanalyzed, yielding reasonably definite conclusions. One is concerned with the multiple comparison problem for the means…
Minimum phi divergence estimator and hierarchical testing in loglinear models
- Mathematics, Computer Science
- 2000
The minimum phi-divergence estimator is defined, which is seen to be a generalization of the maximum likelihood estimator, and which is the basis of two new statistics for solving the problem of testing a nested sequence of loglinear models.
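Several of the references above concern minimum phi-divergence (power-divergence) estimation, which generalizes maximum likelihood for categorical data. A minimal sketch of the Cressie–Read power-divergence subfamily is given below; it assumes two discrete distributions with strictly positive entries, and the function name and numerical threshold are illustrative, not taken from any of the cited papers:

```python
import numpy as np

def power_divergence(p, q, lam):
    """Cressie-Read power divergence between discrete distributions p and q.

    lam -> 0 recovers the Kullback-Leibler divergence (whose minimization
    corresponds to maximum likelihood); lam = 1 gives half the Pearson
    chi-square discrepancy. Assumes all entries of p and q are positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if abs(lam) < 1e-12:  # limiting case: KL divergence
        return float(np.sum(p * np.log(p / q)))
    return float(np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0)))
```

For example, with p = (0.6, 0.4) and q = (0.5, 0.5), lam = 1 yields half the Pearson chi-square value, while lam near 0 yields the KL divergence; both vanish when p = q, which is the defining property a phi-divergence shares across the family.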
Optimal Information Criteria Minimizing Their Asymptotic Mean Square Errors
- Mathematics
- 2016
Methods of minimizing the asymptotic mean square errors of information criteria using the Kullback-Leibler distance are shown. First, optimal multiplicative coefficients of bias adjustment…
Modified AIC and Cp in multivariate linear regression
- Computer Science
- 1997
In a simulation study it is verified that the modified AIC and modified Cp provide better approximations to their risk functions, and better model selection, than AIC and Cp.