Corpus ID: 235265669

Statistical tests based on Rényi entropy estimation

@inproceedings{Cadirci2021StatisticalTB,
  title={Statistical tests based on R\'{e}nyi entropy estimation},
  author={Mehmet Siddik Cadirci and Dafydd Evans and Nikolai Leonenko and Oleg Seleznjev},
  year={2021}
}
Entropy and its various generalizations are important in many fields, including mathematical statistics, communication theory, physics and computer science, for characterizing the amount of information associated with a probability distribution. In this paper we propose goodness-of-fit statistics for the multivariate Student and multivariate Pearson type II distributions, based on the maximum entropy principle and a class of Rényi entropy estimators that use nearest neighbour distances. We…
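The nearest-neighbour Rényi entropy estimators mentioned in the abstract go back to the Leonenko–Pronzato–Savani construction listed in the references below. As a minimal sketch of that idea, not the authors' own implementation, the following Python function estimates the Rényi entropy of order q ≠ 1 from k-th nearest-neighbour distances; the name renyi_entropy_knn and the default k = 5 are placeholders, and the bias-correction constant requires q < k + 1.

```python
import numpy as np
from scipy.special import gammaln
from scipy.spatial import cKDTree

def renyi_entropy_knn(x, q, k=5):
    """Nearest-neighbour estimate of the Renyi entropy of order q (q != 1),
    in the style of Leonenko, Pronzato and Savani (2008).
    Requires q < k + 1 so the bias-correction constant is defined."""
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbour;
    # column 0 of the query result is the point itself at distance 0.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # Log-volume of the unit ball in R^d: V_d = pi^{d/2} / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # (1 - q) * log C_k, where C_k = [Gamma(k) / Gamma(k + 1 - q)]^{1/(1-q)}
    log_ck_1q = gammaln(k) - gammaln(k + 1 - q)
    # zeta_i = (n - 1) * C_k * V_d * rho_i^d, accumulated in logs;
    # I_hat = mean(zeta_i^{1-q}) and H_q = log(I_hat) / (1 - q).
    log_zeta_1q = (1 - q) * (np.log(n - 1) + log_vd + d * np.log(rho)) + log_ck_1q
    return np.log(np.mean(np.exp(log_zeta_1q))) / (1 - q)

# Quick sanity check against the closed form for a standard Gaussian:
# H_2(N(0,1)) = 0.5*log(2*pi) + 0.5*log(2), roughly 1.2655.
rng = np.random.default_rng(0)
sample = rng.standard_normal((20000, 1))
print(renyi_entropy_knn(sample, q=2.0, k=5))
```

The paper's goodness-of-fit statistics combine such estimates with the maximum entropy principle; the sketch above covers only the entropy estimation step.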

References

Showing 1–10 of 27 references
Efficient multivariate entropy estimation via $k$-nearest neighbour distances
TLDR
This paper seeks entropy estimators that are efficient and achieve the local asymptotic minimax lower bound with respect to squared error loss, and proposes new weighted averages of the estimator originally proposed by Kozachenko and Leonenko (1987).
Entropy-based test for generalised Gaussian distributions
A class of Rényi information estimators for multidimensional densities
TLDR
It is shown that entropies of any order q, including Shannon's entropy, can be estimated consistently with minimal assumptions on f, and that it is straightforward to extend the nearest-neighbor method to estimate the statistical distance between two distributions using one i.i.d. sample from each.
A new class of random vector entropy estimators and its applications in testing statistical hypotheses
TLDR
A simulation study indicates that the test involving the proposed entropy estimate has higher power than other well-known competitors under heavy-tailed alternatives, which are frequently used in many financial applications.
Statistical Estimation of the Shannon Entropy
The behavior of the Kozachenko–Leonenko estimates for the (differential) Shannon entropy is studied when the number of i.i.d. vector-valued observations tends to infinity. The asymptotic unbiasedness…
A computationally efficient estimator for mutual information
  • Dafydd Evans
  • Computer Science, Mathematics
    Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2008
TLDR
This work investigates a class of non-parametric estimators for mutual information, based on the nearest neighbour structure of observations in both the joint and marginal spaces, and demonstrates that a well-known estimator of this type can be computationally expensive under certain conditions.
Some results concerning maximum Renyi entropy distributions
On the Kozachenko-Leonenko entropy estimator
Demystifying fixed k-nearest neighbor information estimators
TLDR
It is demonstrated that the KSG estimator is consistent, and an upper bound on the rate of convergence of the ℓ2 error as a function of the number of samples is identified; it is argued that the performance benefits of the KSG estimator stem from a curious "correlation boosting" effect.
Multivariate T-Distributions and Their Applications
Almost all the results available in the literature on multivariate t-distributions published in the last 50 years are now collected together in this comprehensive reference. Because these…
…