# Statistical tests based on Rényi entropy estimation

```bibtex
@inproceedings{Cadirci2021StatisticalTB,
  title  = {Statistical tests based on R\'{e}nyi entropy estimation},
  author = {Mehmet Siddik Cadirci and Dafydd Evans and Nikolai Leonenko and Oleg Seleznjev},
  year   = {2021}
}
```

Entropy and its various generalizations are important in many fields, including mathematical statistics, communication theory, physics and computer science, for characterizing the amount of information associated with a probability distribution. In this paper we propose goodness-of-fit statistics for the multivariate Student and multivariate Pearson type II distributions, based on the maximum entropy principle and a class of nearest-neighbour estimators of Rényi entropy. We…
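The nearest-neighbour Rényi entropy estimators referred to in the abstract can be sketched roughly as follows. This is a minimal numpy illustration in the style of the Leonenko–Pronzato–Savani construction, not the authors' code; the function and variable names are our own, and a real implementation would use a k-d tree rather than a dense distance matrix.

```python
import numpy as np
from math import gamma, log, pi

def knn_distances(x, k):
    """Euclidean distance from each point to its k-th nearest neighbour."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)        # a point is not its own neighbour
    return np.sort(d, axis=1)[:, k - 1]

def renyi_entropy_knn(x, q=2.0, k=5):
    """Nearest-neighbour estimate of the Renyi entropy of order q != 1."""
    n, m = x.shape
    rho = knn_distances(x, k)
    v_m = pi ** (m / 2) / gamma(m / 2 + 1)              # volume of the unit m-ball
    c_k = (gamma(k) / gamma(k + 1 - q)) ** (1 / (1 - q))  # bias-correction constant
    i_hat = np.mean(((n - 1) * c_k * v_m * rho ** m) ** (1 - q))
    return log(i_hat) / (1 - q)
```

For a sample from the uniform distribution on the unit cube, whose Rényi entropy of every order is zero, the estimate should be close to zero for moderate sample sizes.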


## References

Showing 1–10 of 27 references

Efficient multivariate entropy estimation via $k$-nearest neighbour distances

- Mathematics, Computer Science
- The Annals of Statistics
- 2019

This paper seeks entropy estimators that are efficient and achieve the local asymptotic minimax lower bound with respect to squared error loss, and proposes new weighted averages of the estimator originally proposed by Kozachenko and Leonenko (1987).
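The construction summarised in this entry can be illustrated with a small sketch: a Kozachenko–Leonenko estimate is computed for several neighbour orders k, and the results are combined with weights summing to one. This is our own simplified numpy version, not the authors' code, and the specific weights below are arbitrary placeholders rather than the optimal ones derived in the paper.

```python
import numpy as np
from math import log, pi, gamma

EULER = 0.5772156649015329  # Euler-Mascheroni constant

def digamma_int(j):
    # digamma at a positive integer: psi(j) = -EULER + sum_{i<j} 1/i
    return -EULER + sum(1.0 / i for i in range(1, j))

def shannon_kl(x, k):
    """Kozachenko-Leonenko estimate of Shannon entropy using k-th neighbours."""
    n, m = x.shape
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)             # a point is not its own neighbour
    rho = np.sort(d, axis=1)[:, k - 1]      # distance to the k-th nearest neighbour
    v_m = pi ** (m / 2) / gamma(m / 2 + 1)  # volume of the unit m-ball
    return digamma_int(n) - digamma_int(k) + log(v_m) + (m / n) * np.log(rho).sum()

def weighted_shannon_kl(x, weights):
    """Weighted combination over k = 1, ..., len(weights); weights sum to 1."""
    return sum(w * shannon_kl(x, k) for k, w in enumerate(weights, start=1))
```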

Entropy-based test for generalised Gaussian distributions

- Computer Science, Mathematics
- Comput. Stat. Data Anal.
- 2022

A class of Rényi information estimators for multidimensional densities

- Computer Science
- 2008

It is shown that entropies of any order q, including Shannon's entropy, can be estimated consistently under minimal assumptions on the density f, and that the nearest-neighbour method extends straightforwardly to estimating the statistical distance between two distributions using one i.i.d. sample from each.
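The two-sample extension mentioned here can be sketched as follows: a Wang–Kulkarni–Verdú-style nearest-neighbour estimate of the Kullback–Leibler divergence that compares, for each point of the first sample, its k-NN distance within its own sample against its k-NN distance into the other sample. This is our own illustration under those assumptions, not code from the paper.

```python
import numpy as np

def kl_divergence_knn(x, y, k=1):
    """Nearest-neighbour estimate of D(P || Q) from samples x ~ P and y ~ Q."""
    n, m = x.shape
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dxx, np.inf)          # exclude the point itself
    rho = np.sort(dxx, axis=1)[:, k - 1]   # k-NN distance within the x sample
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    nu = np.sort(dxy, axis=1)[:, k - 1]    # k-NN distance from x into the y sample
    return (m / n) * np.sum(np.log(nu / rho)) + np.log(y.shape[0] / (n - 1))
```

When both samples come from the same distribution the estimate should be close to zero.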

A new class of random vector entropy estimators and its applications in testing statistical hypotheses

- Mathematics, Computer Science
- 2005

A simulation study indicates that the test involving the proposed entropy estimate has higher power than other well-known competitors under heavy tailed alternatives which are frequently used in many financial applications.

Statistical Estimation of the Shannon Entropy

- Mathematics
- Acta Mathematica Sinica, English Series
- 2018

The behavior of the Kozachenko–Leonenko estimates for the (differential) Shannon entropy is studied when the number of i.i.d. vector-valued observations tends to infinity. The asymptotic unbiasedness…

A computationally efficient estimator for mutual information

- Computer Science, Mathematics
- Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
- 2008

This work investigates a class of non-parametric estimators for mutual information, based on the nearest neighbour structure of observations in both the joint and marginal spaces, and demonstrates that a well-known estimator of this type can be computationally expensive under certain conditions.

Demystifying fixed k-nearest neighbor information estimators

- Computer Science, Mathematics
- 2017 IEEE International Symposium on Information Theory (ISIT)
- 2017

It is demonstrated that the KSG estimator is consistent, an upper bound on the rate of convergence of its ℓ2 error as a function of the number of samples is identified, and it is argued that the performance benefits of the KSG estimator stem from a curious “correlation boosting” effect.
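The fixed-k KSG estimator discussed in this entry can be sketched for scalar variables. This is a minimal numpy version of the first Kraskov–Stögbauer–Grassberger algorithm, simplified to one-dimensional marginals and a dense distance matrix; it is our own illustration, not the paper's code.

```python
import numpy as np

EULER = 0.5772156649015329  # Euler-Mascheroni constant

def digamma_int(j):
    # digamma at a positive integer: psi(j) = -EULER + sum_{i<j} 1/i
    return -EULER + sum(1.0 / i for i in range(1, int(j)))

def ksg_mutual_information(x, y, k=3):
    """KSG (algorithm 1) mutual information estimate for scalar x and y."""
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])
    dy = np.abs(y[:, None] - y[None, :])
    dz = np.maximum(dx, dy)              # max-norm distance in the joint space
    np.fill_diagonal(dz, np.inf)
    eps = np.sort(dz, axis=1)[:, k - 1]  # k-NN distance in the joint space
    # neighbours strictly within eps in each marginal (minus the point itself)
    nx = (dx < eps[:, None]).sum(axis=1) - 1
    ny = (dy < eps[:, None]).sum(axis=1) - 1
    psi_sum = np.mean([digamma_int(a + 1) + digamma_int(b + 1)
                       for a, b in zip(nx, ny)])
    return digamma_int(k) + digamma_int(n) - psi_sum
```

For independent variables the estimate should be close to zero.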

Multivariate T-Distributions and Their Applications

- Economics
- 2004

Almost all the results available in the literature on multivariate t-distributions published in the last 50 years are now collected together in this comprehensive reference. Because these…