On the Bahadur-Efficient Testing of Uniformity by Means of the Entropy

@article{Harremos2008OnTB,
  title={On the Bahadur-Efficient Testing of Uniformity by Means of the Entropy},
  author={Peter Harremo{\"e}s and Igor Vajda},
  journal={IEEE Transactions on Information Theory},
  year={2008},
  volume={54},
  pages={321-331}
}
  • P. Harremoës, I. Vajda
  • Published 2008
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
This paper compares the power divergence statistics of orders α > 1 with the information divergence statistic in the problem of testing the uniformity of a distribution. In this problem, the information divergence statistic is equivalent to the entropy statistic. Extending some previously established results about information diagrams, it is proved that the information divergence statistic in this problem is more efficient in the Bahadur sense than any power divergence statistic of order α > 1. This means…
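To make the compared statistics concrete, here is a minimal Python sketch (illustrative only, not code from the paper) for a sample of n observations over k cells: the information divergence statistic 2n·D(P̂ ∥ U) equals 2n(log k − H(P̂)) and is therefore equivalent to the entropy statistic, while the power divergence family is taken in the Cressie-Read normalization, which for α = 2 reduces to Pearson's chi-square. The function name and example counts are assumptions made for illustration.

```python
import numpy as np

def uniformity_statistics(counts, alpha=2.0):
    """Two goodness-of-fit statistics for testing uniformity over k cells.

    Returns (information divergence statistic, power divergence statistic):
      info_div  = 2n * D(P_hat || U) = 2n * (log k - H(P_hat))
      power_div = 2n / (alpha*(alpha - 1)) * sum p_i * ((k*p_i)**(alpha-1) - 1)
    Requires alpha not in {0, 1}; alpha = 2 recovers Pearson's chi-square.
    """
    counts = np.asarray(counts, dtype=float)
    n, k = counts.sum(), counts.size
    p = counts / n                        # empirical distribution P_hat
    nz = p > 0                            # zero cells contribute 0 for alpha > 0
    info_div = 2.0 * n * np.sum(p[nz] * np.log(k * p[nz]))
    power_div = (2.0 * n / (alpha * (alpha - 1.0))
                 * np.sum(p[nz] * ((k * p[nz]) ** (alpha - 1.0) - 1.0)))
    return info_div, power_div

# Example: 100 observations over 5 cells; alpha = 2 gives Pearson's chi-square
print(uniformity_statistics([25, 18, 22, 15, 20], alpha=2.0))
```

As α → 1 the power divergence statistic tends to the information divergence statistic (the likelihood ratio statistic G²), which is why the two families meet at α = 1.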

Citations

Entropy Testing is Efficient

It is proved that in this problem of testing the uniformity of a distribution the information divergence statistic is more efficient in the Bahadur sense than any power divergence statistic of order α > 1.

On Bahadur Efficiency of Power Divergence Statistics

It is proved that the information divergence statistic is infinitely more Bahadur efficient than the power divergence statistics of the orders α > 1, as long as the sequence of alternatives is…
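For orientation, a standard definition (due to Bahadur; the symbols below are not from the papers listed here): if L_n is the p-value attained by a test statistic T_n, the Bahadur exact slope at a fixed alternative θ is

\[
c_T(\theta) \;=\; \lim_{n\to\infty} -\frac{2}{n}\,\log L_n \qquad \text{a.s. under } \theta,
\]

and T is more Bahadur efficient than S at θ when c_T(θ) > c_S(θ). Loosely, "infinitely more Bahadur efficient" above means that the ratio of slopes diverges along the sequence of alternatives considered.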

Efficiency of entropy testing

It is shown that in a certain sense Shannon entropy is more efficient than Rényi entropy for α ∈ ]0, 1[. This indicates that the definition of relative efficiency given in [1] does not fully capture the notion of efficiency.

Mutual information of contingency tables and related inequalities

  • P. Harremoës
  • Mathematics
    2014 IEEE International Symposium on Information Theory
  • 2014
The signed log-likelihood is introduced, and it is demonstrated that its distribution function can be related to the distribution function of a standard Gaussian by inequalities; a general conjecture about how close the signed log-likelihood is to a standard Gaussian is formulated.
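As a rough illustration of the construction summarized above, a minimal Python sketch of the signed log-likelihood in the binomial case (an assumption: the normalization below follows the standard binomial setting and may differ in detail from the paper's contingency-table version):

```python
import numpy as np

def kl_bernoulli(a, b):
    """Kullback-Leibler divergence D(Ber(a) || Ber(b)) in nats."""
    t = 0.0
    if a > 0:
        t += a * np.log(a / b)
    if a < 1:
        t += (1 - a) * np.log((1 - a) / (1 - b))
    return t

def signed_log_likelihood(x, n, p):
    """Signed log-likelihood G = sign(x/n - p) * sqrt(2n * D(x/n || p)).

    Illustrative sketch only: under the null hypothesis, the distribution
    function of G is close to that of a standard Gaussian, which is the
    kind of comparison the inequalities in the paper make precise.
    """
    q = x / n
    return np.sign(q - p) * np.sqrt(2.0 * n * kl_bernoulli(q, p))

# Example: 60 successes in 100 trials under p = 0.5
print(signed_log_likelihood(60, 100, 0.5))   # ~2.01, close to the z-score 2.0
```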

Measures of Qualitative Variation in the Case of Maximum Entropy

This study tests the normality of various qualitative variation measures in general, and of the VarNC and StDev statistics in the case of maximum uncertainty; it derives the probability distribution functions of these statistics and proves that they are consistent.

The Rate Distortion Test of Normality

  • P. Harremoës
  • Computer Science
    2019 IEEE International Symposium on Information Theory (ISIT)
  • 2019
We use techniques from rate distortion theory in testing normality. The idea is first to do optimal compression with respect to squared Euclidean distance and then use information divergence of the…

Large Deviations of χ²-Divergence Errors on Partitions

We discuss Chernoff-type large deviation results for χ²-divergence errors on partitions. In contrast to the total variation and the I-divergence, the χ²-divergence has an unconventional large…

On optimal two sample homogeneity tests for finite alphabets

  • J. Unnikrishnan
  • Mathematics
    2012 IEEE International Symposium on Information Theory Proceedings
  • 2012
It is argued that such homogeneity tests with provable optimality properties could potentially be better choices than the chi-square test in practice, and guidelines are provided for choosing thresholds that guarantee an approximate false-alarm constraint for finite-length observation sequences.

Joint Range of Rényi Entropies

The exact range of the joint values of several Rényi entropies is determined. The method is based on topology, with special emphasis on the orientation of the objects studied. Like in the case when…

References

Showing 1-10 of 49 references

On asymptotic properties of information-theoretic divergences

  • M. Pardo, I. Vajda
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 2003
Mutual asymptotic equivalence is established within three classes of information-theoretic divergences of discrete probability distributions, namely, the f-divergences of Csiszár, the Bregman divergences, and the Burbea-Rao divergences, for testing the goodness of fit when the hypothetical distribution is uniform.

On convergence of information contained in quantized observations

  • I. Vajda
  • Computer Science
    IEEE Trans. Inf. Theory
  • 2002
This paper proves the convergence of the reduced values of these functionals to their original unreduced values for various sequences P_n of partitions of the observation space, for the most common types of partitions.

Large deviations of divergence measures on partitions

Information-theoretic methods in testing the goodness of fit

  • L. Györfi, G. Morvai, I. Vajda
  • Mathematics, Computer Science
    2000 IEEE International Symposium on Information Theory (Cat. No.00CH37060)
  • 2000
A new approach to evaluating the efficiency of information-divergence-type statistics for testing the goodness of fit is presented, focused on the Bahadur efficiency.

Some Limit Theorems in Statistics

Contents: Moment-Generating Functions; Chernoff's Theorem; The Kullback-Leibler Information Number; Some Examples of Large Deviation Probabilities; Stein's Lemma; Asymptotic Effective Variances; Exact Slopes of…

Inequalities between entropy and index of coincidence derived from information diagrams

The main result of the paper is the determination of the precise range of the map P → (IC(P), H(P)), which gives rise to precise lower as well as upper bounds for the entropy function.
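Both coordinates of that map are elementary to compute; a small illustrative sketch (the function name is an assumption):

```python
import numpy as np

def ic_and_entropy(p):
    """The map P -> (IC(P), H(P)): index of coincidence IC(P) = sum(p_i^2)
    and Shannon entropy H(P) = -sum(p_i * log p_i), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # 0 * log 0 = 0 by convention
    return float(np.sum(p ** 2)), float(-np.sum(p * np.log(p)))

# For the uniform distribution on k points: IC = 1/k and H = log k
print(ic_and_entropy([0.25, 0.25, 0.25, 0.25]))   # (0.25, 1.3862...)
```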

The Number of Classes in Chi-Squared Goodness-of-Fit Tests

The power of Pearson chi-squared and likelihood ratio goodness-of-fit tests based on different partitions is studied by considering families of densities “between” the null density and fixed…

Asymptotic distributions of φ‐divergences of hypothetical and observed frequencies on refined partitions

For a wide class of goodness-of-fit statistics based on φ-divergences between hypothetical cell probabilities and observed relative frequencies, the asymptotic normality is established under the…

EFFICIENCIES OF CHI-SQUARE AND LIKELIHOOD RATIO GOODNESS-OF-FIT TESTS

The classical problem of choice of number of classes in testing goodness of fit is considered for a class of alternatives, for the chi-square and likelihood ratio statistics. Pitman and Bahadur…