A family of the information criteria using the phi-divergence for categorical data

@article{Ogasawara2018AFO,
  title={A family of the information criteria using the phi-divergence for categorical data},
  author={Haruhiko Ogasawara},
  journal={Comput. Stat. Data Anal.},
  year={2018},
  volume={124},
  pages={87-103}
}
  • H. Ogasawara
  • Published 1 August 2018
  • Computer Science
  • Comput. Stat. Data Anal.

Citations

Asymptotic cumulants of the minimum phi-divergence estimator for categorical data under possible model misspecification
  • H. Ogasawara
  • Mathematics
  • Communications in Statistics - Theory and Methods
  • 2019
Abstract: The asymptotic cumulants of the minimum phi-divergence estimators of the parameters in a model for categorical data are obtained up to the fourth order with the higher-order asymptotic …

References

Showing 1-10 of 58 references
Minimum power-divergence estimator in three-way contingency tables
Cressie et al. (2000; 2003) introduced and studied a new family of statistics, based on the φ-divergence measure, for solving the problem of testing a nested sequence of loglinear models. …
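
As a concrete reference point, here is a minimal sketch of the Cressie-Read power-divergence statistic, the standard member of the φ-divergence family of tests this entry refers to; the counts and fitted probabilities are made-up illustrations, not data from the paper.

```python
# Cressie-Read power-divergence statistic for multinomial goodness of fit.
import numpy as np

def power_divergence_stat(counts, pi_hat, lam):
    """2*n*I^lam: lam = 1 gives Pearson's X^2; the lam -> 0 limit is G^2."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_obs = counts / n
    if np.isclose(lam, 0.0):  # likelihood-ratio limit G^2
        mask = p_obs > 0
        return 2.0 * n * np.sum(p_obs[mask] * np.log(p_obs[mask] / pi_hat[mask]))
    return (2.0 * n / (lam * (lam + 1.0))) * np.sum(p_obs * ((p_obs / pi_hat) ** lam - 1.0))

counts = [30, 50, 20]                  # illustrative observed cell counts
pi_hat = np.array([0.25, 0.5, 0.25])   # illustrative fitted model probabilities
print(power_divergence_stat(counts, pi_hat, lam=1.0))  # Pearson X^2 = 2.0
print(power_divergence_stat(counts, pi_hat, lam=2/3))  # Cressie-Read statistic
```
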
Size and power considerations for testing loglinear models using divergence test statistics
In this article, we assume that categorical data are distributed according to a multinomial distribution whose probabilities follow a loglinear model. The inference problem we consider is that of …
Minimum φ-Divergence Estimation in Constrained Latent Class Models for Binary Data
The main purpose of this paper is to introduce and study the behavior of minimum φ-divergence estimators as an alternative to the maximum-likelihood estimator in latent class models for binary data.
A class of cross-validatory model selection criteria
In this paper, we define a class of cross-validatory model selection criteria as an estimator of the predictive risk function based on a discrepancy between a candidate model and the true model. …
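
A minimal sketch of the cross-validatory idea under an assumed multinomial setup: estimate each candidate's predictive risk by held-out negative log-likelihood. The data and the two candidate fitters (`uniform`, `saturated`) are hypothetical stand-ins, not constructs from the paper.

```python
# K-fold cross-validation as an estimator of predictive risk.
import numpy as np

rng = np.random.default_rng(0)
data = rng.choice(4, size=200, p=[0.1, 0.2, 0.3, 0.4])  # categorical sample

def cv_risk(data, fit, k=5):
    """K-fold estimate of predictive risk: mean held-out -log p(x)."""
    folds = np.array_split(rng.permutation(data), k)
    risks = []
    for i in range(k):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        probs = fit(train)                       # fitted cell probabilities
        risks.append(-np.log(probs[folds[i]]).mean())
    return float(np.mean(risks))

uniform = lambda train: np.full(4, 0.25)         # candidate 1: uniform cells
saturated = lambda train: (np.bincount(train, minlength=4) + 0.5) / (len(train) + 2.0)

print(cv_risk(data, uniform), cv_risk(data, saturated))  # prefer the smaller risk
```
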
Further analysts of the data by Akaike's information criterion and the finite corrections
Using Akaike's information criterion, three examples of statistical data are reanalyzed, yielding reasonably definite conclusions. One is concerned with the multiple comparison problem for the means …
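
For orientation, a minimal sketch of AIC next to its finite-sample correction (the AICc form associated with Sugiura and, later, Hurvich and Tsai); the log-likelihood and sample size below are illustrative numbers only.

```python
# AIC and its finite-sample correction AICc.
def aic(loglik, q):
    """Akaike's criterion for a model with q free parameters."""
    return -2.0 * loglik + 2.0 * q

def aicc(loglik, q, n):
    """Finite correction; tends to AIC as the sample size n grows."""
    return aic(loglik, q) + 2.0 * q * (q + 1.0) / (n - q - 1.0)

print(aic(-120.3, q=4), aicc(-120.3, q=4, n=30))
```
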
Minimum phi divergence estimator and hierarchical testing in loglinear models
The minimum phi-divergence estimator is defined, which is seen to be a generalization of the maximum likelihood estimator, and which is the basis of two new statistics for solving the problem of testing a nested sequence of loglinear models.
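
A minimal sketch of minimum phi-divergence estimation by direct numerical minimization; the one-parameter multinomial model `pi(theta)` and the counts are made-up examples. With the likelihood-ratio member of the phi family, the minimizer is the maximum-likelihood estimator, which matches the generalization noted in the snippet.

```python
# Minimum phi-divergence estimation for a toy parametric multinomial model.
import numpy as np
from scipy.optimize import minimize_scalar

counts = np.array([42.0, 31.0, 27.0])
p_obs = counts / counts.sum()

def pi(theta):
    """Hypothetical model: one free cell, the rest split evenly."""
    return np.array([theta, (1.0 - theta) / 2.0, (1.0 - theta) / 2.0])

def phi_div(p, q, lam=2/3):
    """Power-divergence member of the phi family (zero iff p == q)."""
    return np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0))

res = minimize_scalar(lambda t: phi_div(p_obs, pi(t)),
                      bounds=(1e-6, 1.0 - 1e-6), method="bounded")
print(res.x)  # minimum phi-divergence estimate of theta
```
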
Optimal Information Criteria Minimizing Their Asymptotic Mean Square Errors
Abstract: Methods of minimizing the asymptotic mean square errors of information criteria using the Kullback-Leibler distance are shown. First, optimal multiplicative coefficients of bias adjustment …
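
The general shape being optimized can be sketched as a criterion whose bias-adjustment term carries a multiplicative coefficient c, with c = 1 recovering AIC. The paper's optimal coefficients are not reproduced here, so the non-unit c below is purely illustrative.

```python
# Information criterion with a multiplicative coefficient on the penalty.
def criterion(loglik, q, c=1.0):
    """-2*loglik plus c times the usual 2q bias adjustment."""
    return -2.0 * loglik + c * 2.0 * q

print(criterion(-120.3, q=4))          # c = 1: ordinary AIC
print(criterion(-120.3, q=4, c=1.2))   # hypothetical adjusted coefficient
```
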
Modified AIC and Cp in multivariate linear regression
In a simulation study it is verified that the modified AIC and modified Cp provide better approximations to their risk functions, and better model selection, than AIC and Cp.
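
As a baseline for the modified versions, a minimal sketch of Mallows' Cp in its familiar univariate form; the paper's multivariate modifications are not reproduced, and all numbers are illustrative.

```python
# Mallows' Cp for a candidate regression with q parameters.
def mallows_cp(rss_q, sigma2_full, n, q):
    """Cp = RSS_q / sigma2_full - n + 2q; models with Cp near q fit well."""
    return rss_q / sigma2_full - n + 2.0 * q

print(mallows_cp(rss_q=55.0, sigma2_full=1.1, n=50, q=3))
```
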