Asymptotic Normality for Plug-In Estimators of Generalized Shannon’s Entropy

@article{Zhang2022AsymptoticNF,
  title={Asymptotic Normality for Plug-In Estimators of Generalized Shannon’s Entropy},
  author={Jialin Zhang and Jingyi Shi},
  journal={Entropy},
  year={2022},
  volume={24}
}
Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill the void in the foundation of information theory, Zhang (2020) proposed generalized…
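To make the object of study concrete, here is a minimal sketch of the plug-in (maximum-likelihood) entropy estimator the paper analyzes: empirical frequencies are substituted into Shannon's formula H = −Σ_k p_k log p_k. The sampling distribution and function name below are illustrative choices, not taken from the paper.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in estimate of Shannon's entropy in nats:
    H_hat = -sum(p_hat * log(p_hat)) over observed letters."""
    counts = np.asarray(counts, dtype=float)
    p_hat = counts / counts.sum()
    p_hat = p_hat[p_hat > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p_hat * np.log(p_hat))

# Example: n = 1000 draws from a geometric law on a countable alphabet
rng = np.random.default_rng(0)
sample = rng.geometric(p=0.3, size=1000)
_, counts = np.unique(sample, return_counts=True)
print(plugin_entropy(counts))  # compare with the true entropy of the geometric law
```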

Citations

Entropic Statistics: Concept, Estimation, and Application in Machine Learning and Knowledge Extraction

  • Jialin Zhang
  • Computer Science
    Machine Learning and Knowledge Extraction
  • 2022
Recent developments in entropic statistics are reviewed, including estimation of Shannon’s entropy and its functionals (such as mutual information and Kullback–Leibler divergence), the concepts of entropic basis and generalized Shannon’s entropy (and its functionals), and their estimation and potential applications in machine learning and knowledge extraction.
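As a concrete instance of the functionals mentioned above, a minimal sketch (mine, not the review’s) of plug-in estimation of mutual information from a contingency table of joint counts; the example table is made up.

```python
import numpy as np

def plugin_mutual_information(table):
    """Plug-in estimate of I(X;Y) = sum_ij p_ij * log(p_ij / (p_i. * p_.j)),
    computed in nats from a table of joint counts."""
    p = np.asarray(table, dtype=float)
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)  # marginal of X (rows)
    py = p.sum(axis=0, keepdims=True)  # marginal of Y (columns)
    mask = p > 0
    return np.sum(p[mask] * np.log((p / (px * py))[mask]))

# Example: a 2x3 table of cross-tabulated counts
print(plugin_mutual_information([[30, 10, 5], [5, 20, 30]]))
```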

References

Showing 1–10 of 28 references.

Entropy Estimation in Turing's Perspective

A new nonparametric estimator of Shannon’s entropy on a countable alphabet is proposed and analyzed against the well-known plug-in estimator; the proposed estimator has a bias decaying exponentially in n.
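For context, a standard fact (Miller’s classical bias expansion, stated here for orientation, not quoted from the reference): on a finite alphabet of K letters the plug-in estimator is biased downward at only a polynomial rate,

\[ \mathbb{E}\bigl[\hat H_{\text{plug-in}}\bigr] - H \;=\; -\frac{K-1}{2n} + O\!\left(n^{-2}\right), \]

which is what motivates estimators whose bias decays exponentially in n.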

Asymptotic normality for plug-in estimators of diversity indices on countable alphabets

The plug-in estimator is one of the most popular approaches to the estimation of diversity indices. In this paper, we study its asymptotic distribution for a large class of diversity indices…
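A concrete member of such a class (standard material, supplied here for illustration): for additive indices of the form θ = Σ_k h(p_k), the plug-in estimator substitutes empirical frequencies,

\[ \theta = \sum_k h(p_k), \qquad \hat\theta = \sum_k h(\hat p_k), \]

with the Gini–Simpson index θ = 1 − Σ_k p_k², corresponding to h(p) = p(1 − p), as a familiar special case.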

Infinite Shannon entropy

This work develops several particularly simple, elementary, and useful bounds, and also provides some asymptotic estimates, leading to necessary and sufficient conditions for the occurrence of infinite Shannon entropy.
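A classical example of the phenomenon (a standard fact, not necessarily one of the bounds developed in this work): p_k ∝ 1/(k (log k)²) for k ≥ 2 defines a valid distribution, yet its entropy diverges, since with normalizing constant C,

\[ H = \sum_{k\ge 2} p_k \log\frac{1}{p_k} \;\ge\; \sum_{k\ge 2} \frac{C\,\log k}{k(\log k)^2} \;=\; C \sum_{k\ge 2} \frac{1}{k\log k} \;=\; \infty . \]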

Generalized Mutual Information

This article proposes a family of generalized mutual information whose members are indexed by a positive integer n, with the nth member being the mutual information of nth order.

A Brief Review of Generalized Entropies

This review focuses on the so-called generalized entropies, which from a mathematical point of view are nonnegative functions defined on probability distributions that satisfy the first three Shannon–Khinchin axioms: continuity, maximality and expansibility.
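Two widely used members of this family, given here for concreteness (standard definitions): the Rényi and Tsallis entropies of order α ≠ 1,

\[ H_\alpha^{\mathrm{R}}(p) = \frac{1}{1-\alpha}\log\sum_k p_k^\alpha, \qquad S_\alpha^{\mathrm{T}}(p) = \frac{1}{\alpha-1}\Bigl(1 - \sum_k p_k^\alpha\Bigr), \]

both of which recover Shannon’s entropy in the limit α → 1.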

A Normal Law for the Plug-in Estimator of Entropy

This paper establishes a sufficient condition for the asymptotic normality of the plug-in estimator of Shannon’s entropy defined on a countable alphabet. The sufficient condition covers a range of…
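The flavor of such a result, in its standard delta-method form (stated for orientation; the paper’s exact conditions are in the reference): with σ² = Var(−log p(X)) = Σ_k p_k (log p_k)² − H², provided 0 < σ² < ∞,

\[ \sqrt{n}\,\bigl(\hat H_{\text{plug-in}} - H\bigr) \;\xrightarrow{d}\; N(0, \sigma^2). \]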

Asymptotic Normality of an Entropy Estimator With Exponentially Decaying Bias

  • Zhiyi Zhang
  • Mathematics
    IEEE Transactions on Information Theory
  • 2013
Asymptotic normality is established for an entropy estimator with an exponentially decaying bias on any finite alphabet, and the nonparametric estimator is shown to be asymptotically efficient.

Estimation of population size in entropic perspective

Nonparametric estimation of population size is a long-standing and difficult problem. It is difficult because, particularly from a likelihood perspective, the underlying distribution could…
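One classical point of comparison in this problem (a standard estimator, not necessarily the one proposed in the reference): the Chao1 lower bound infers unseen support size from the counts of rare letters,

\[ \hat K_{\text{Chao1}} = K_{\text{obs}} + \frac{f_1^2}{2 f_2}, \]

where f_1 and f_2 are the numbers of letters observed exactly once and exactly twice in the sample.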

A mathematical theory of communication

In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions.
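In the continuous case the discrete sum becomes an integral; the differential entropy of a density f is (standard definition)

\[ h(f) = -\int f(x)\,\log f(x)\,dx , \]

which, unlike its discrete counterpart, can be negative.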

Correction to: Tree-Based Analysis: A Practical Approach to Create Clinical Decision-Making Tools.

Qualitative and practical aspects of tree-based methods are reviewed, with a focus on diagnostic classification (binary outcomes) and prognostication (censored survival outcomes); creating an ensemble of trees improves prediction accuracy and addresses the instability of a single tree.
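Connecting this entry back to the abstract’s mention of Random Forests: a minimal scikit-learn sketch (illustrative only; the dataset and parameter choices are assumptions, not drawn from the reference) of an entropy-based splitting criterion and of how an ensemble stabilizes a single tree.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# A single tree splitting on information gain (Shannon's entropy).
tree = DecisionTreeClassifier(criterion="entropy", random_state=0)

# An ensemble of such trees; averaging addresses single-tree instability.
forest = RandomForestClassifier(criterion="entropy", n_estimators=200, random_state=0)

print("single tree:", cross_val_score(tree, X, y, cv=5).mean())
print("forest:     ", cross_val_score(forest, X, y, cv=5).mean())
```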