# A test for independence via Bayesian nonparametric estimation of mutual information

@article{AlLabadi2020ATF, title={A test for independence via Bayesian nonparametric estimation of mutual information}, author={Luai Al-Labadi and Forough Fazeli Asl and Zahra Saberi}, journal={Canadian Journal of Statistics}, year={2020}, volume={50} }

Mutual information is a well‐known tool to measure the mutual dependence between variables. In this article, a Bayesian nonparametric estimator of mutual information is established by means of the Dirichlet process and the k‐nearest neighbour distance. As a result, an easy‐to‐implement test of independence is introduced through the relative belief ratio. Several theoretical properties of the approach are presented. The procedure is illustrated through various examples and is compared with its…

## 4 Citations

### A Bayesian Nonparametric Estimation of Mutual Information

- Computer Science, ArXiv
- 2021

The main goal of this paper is to provide an efficient estimator of the mutual information based on the approach of Al Labadi et al.

### Accelerating Causal Inference and Feature Selection Methods through G-Test Computation Reuse

- Computer Science, Entropy
- 2021

This method greatly improves the efficiency of applications that perform a series of G-tests on permutations of the same features, such as feature selection and causal inference applications, because the decomposition allows intensive reuse of partial results.

### Mutual information matrix based on Rényi entropy and application

- Computer Science, Nonlinear Dynamics
- 2022

### On robustness of the relative belief ratio and the strength of its evidence with respect to the geometric contamination prior

- Journal of the Korean Statistical Society
- 2022

## References

Showing 1-10 of 63 references

### Nonparametric independence testing via mutual information

- Computer Science, Mathematics, Biometrika
- 2019

This work proposes a test of independence of two multivariate random vectors, given a sample from the underlying population, based on the estimation of mutual information; the decomposition of mutual information into joint and marginal entropies facilitates the use of recently developed efficient entropy estimators derived from nearest-neighbour distances.
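The decomposition this abstract refers to, I(X;Y) = H(X) + H(Y) − H(X,Y), can be sketched with the classical (unweighted) Kozachenko–Leonenko k-nearest-neighbour entropy estimator. The efficient weighted variants go beyond this sketch, and all function names below are illustrative, not from any of the cited papers.

```python
import numpy as np
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Plain Kozachenko-Leonenko k-NN entropy estimate (in nats).

    x has shape (n,) or (n, d); brute-force Euclidean neighbour
    search, so this is only meant for small samples."""
    x = np.asarray(x, float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    dist = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)            # a point is not its own neighbour
    rho = np.sort(dist, axis=1)[:, k - 1]     # distance to the k-th nearest neighbour
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log-volume of unit d-ball
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))

def mi_via_entropies(x, y, k=3):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), each entropy estimated by k-NN."""
    joint = np.column_stack([x, y])
    return kl_entropy(x, k) + kl_entropy(y, k) - kl_entropy(joint, k)

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
mi_dep = mi_via_entropies(x, x + 0.2 * rng.normal(size=1000))  # strongly dependent pair
mi_ind = mi_via_entropies(x, rng.normal(size=1000))            # independent pair
```

For the dependent pair the estimate should sit well above zero (the population value here is 0.5·ln(26) ≈ 1.6 nats), while for the independent pair it hovers near zero. The naive entropy-sum estimator is biased, which is precisely what motivates the weighted efficient estimators discussed in these references.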

### Non parametric estimation of Joint entropy and Shannon mutual information, Asymptotic limits: Application to statistic tests

- Mathematics, Computer Science
- 2019

This estimator is used to construct joint entropy and Shannon mutual information estimates for a pair of discrete random variables; almost sure consistency and central limit theorems are established.

### A Bayesian nonparametric estimation to entropy

- Mathematics, Brazilian Journal of Probability and Statistics
- 2019

A Bayesian nonparametric estimator to entropy is proposed. The derivation of the new estimator relies on using the Dirichlet process and adapting the well-known frequentist estimators of Vasicek…

### Scalable Bayesian nonparametric measures for exploring pairwise dependence via Dirichlet Process Mixtures.

- Computer Science, Electronic Journal of Statistics
- 2016

This article proposes novel Bayesian nonparametric methods using Dirichlet Process Mixture models for detecting pairwise dependence between random variables while accounting for uncertainty about the form of the underlying distributions, and presents Bayesian diagnostic measures for characterising evidence against a "null model" of pairwise independence.

### Nonparametric goodness-of-fit

- Mathematics
- 1999

This paper develops an approach to testing the adequacy of both classical and Bayesian models given sample data. An important feature of the approach is that we are able to test the practical…

### A Bayesian nonparametric approach to testing for dependence between random variables

- Computer Science
- 2015

A Bayesian nonparametric procedure is proposed that leads to a tractable, explicit and analytic quantification of the relative evidence for dependence versus independence; it uses Polya tree priors on the space of probability measures, embedded within a decision-theoretic test for dependence.

### Efficient multivariate entropy estimation via $k$-nearest neighbour distances

- Mathematics, Computer Science, The Annals of Statistics
- 2019

This paper seeks entropy estimators that are efficient and achieve the local asymptotic minimax lower bound with respect to squared error loss, and proposes new weighted averages of the estimators originally proposed by Kozachenko and Leonenko (1987).

### Goodness-of-fit tests based on the distance between the Dirichlet process and its base measure

- Mathematics
- 2014

The Dirichlet process is a fundamental tool in studying Bayesian nonparametric inference. The Dirichlet process has several sum representations, where each one of these representations highlights…

### Estimating mutual information.

- Computer Science, Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
- 2004

Two classes of improved estimators for the mutual information M(X,Y), computed from samples of random points distributed according to some joint probability density μ(x,y) and based on entropy estimates from k-nearest-neighbor distances, are presented.
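The first of those two estimators (often called KSG "estimator 1") is simple enough to sketch for scalar X and Y. The brute-force version below uses the max-norm in the joint space and SciPy's digamma for ψ; it is a sketch under those assumptions, not the authors' reference implementation.

```python
import numpy as np
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """KSG 'estimator 1' of I(X;Y) for scalar samples (in nats):
    I ~ psi(k) + psi(n) - <psi(n_x + 1) + psi(n_y + 1)>."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])   # pairwise distances in X
    dy = np.abs(y[:, None] - y[None, :])   # pairwise distances in Y
    dz = np.maximum(dx, dy)                # max-norm distance in the joint space
    np.fill_diagonal(dz, np.inf)           # exclude each point from its own neighbours
    eps = np.sort(dz, axis=1)[:, k - 1]    # distance to the k-th nearest joint neighbour
    np.fill_diagonal(dx, np.inf)
    np.fill_diagonal(dy, np.inf)
    # n_x(i), n_y(i): marginal neighbours strictly inside the joint radius eps_i
    n_x = np.sum(dx < eps[:, None], axis=1)
    n_y = np.sum(dy < eps[:, None], axis=1)
    return digamma(k) + digamma(n) - np.mean(digamma(n_x + 1) + digamma(n_y + 1))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
mi_dep = ksg_mutual_information(x, x + 0.1 * rng.normal(size=500))  # dependent pair
mi_ind = ksg_mutual_information(x, rng.normal(size=500))            # independent pair
```

Counting marginal neighbours inside the joint k-NN radius makes the individual entropy biases largely cancel, which is why this estimator remains a standard baseline for the k-NN approaches surveyed above.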