On Some Properties of Goodness of Fit Measures Based on Statistical Entropy

Abstract

Goodness of fit tests can be categorized in several ways. One categorization is based on the differences between observed and expected frequencies. Another is based on the differences between values of the distribution functions. Still another is based on the divergence of one distribution from another. Some widely used and well-known divergences, such as the Kullback-Leibler divergence and the Jeffreys divergence, are based on entropy concepts. In this study, we compare some basic goodness of fit tests in terms of their statistical properties, with some applications.
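
To make the connection concrete, the following is a minimal sketch (not taken from the paper) of how entropy-based divergences such as the Kullback-Leibler and Jeffreys divergences can be computed from observed and expected frequencies and set alongside the classical Pearson chi-square statistic. The counts, function names, and the uniform null model are illustrative assumptions, not the authors' data or implementation.

import numpy as np
from scipy.stats import chisquare

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def jeffreys_divergence(p, q):
    """Symmetrized KL divergence: J(p, q) = D(p || q) + D(q || p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Illustrative data: observed counts vs. a hypothesized uniform model
observed = np.array([18, 22, 30, 30])
expected = np.full(4, observed.sum() / 4)

p = observed / observed.sum()   # empirical probabilities
q = expected / expected.sum()   # hypothesized probabilities

chi2, p_value = chisquare(observed, expected)
print(f"Pearson chi-square: {chi2:.4f} (p = {p_value:.4f})")
print(f"KL divergence D(p||q): {kl_divergence(p, q):.4f}")
print(f"Jeffreys divergence J(p,q): {jeffreys_divergence(p, q):.4f}")
# 2*n*D(p||q) equals the likelihood-ratio (G-test) statistic,
# which is asymptotically chi-square distributed under the null
print(f"G-statistic 2*n*D(p||q): {2 * observed.sum() * kl_divergence(p, q):.4f}")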


Cite this paper

@inproceedings{Evren2012OnSP,
  title  = {On Some Properties of Goodness of Fit Measures Based on Statistical Entropy},
  author = {Atif Evren and Elif Tuna},
  year   = {2012}
}