Equitability, mutual information, and the maximal information coefficient

@article{Kinney2014EquitabilityMI,
  title={Equitability, mutual information, and the maximal information coefficient},
  author={J. Kinney and G. Atwal},
  journal={Proceedings of the National Academy of Sciences},
  year={2014},
  volume={111},
  pages={3354--3359}
}
  • J. Kinney, G. Atwal
  • Published 2014
  • Mathematics, Biology, Medicine
  • Proceedings of the National Academy of Sciences
Significance: Attention has recently focused on a basic yet unresolved problem in statistics: How can one quantify the strength of a statistical association between two variables without bias for relationships of a specific form? Here we propose a way of mathematically formalizing this “equitability” criterion, using core concepts from information theory. This criterion is naturally satisfied by a fundamental information-theoretic measure of dependence called “mutual information.” By contrast, a…
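For context on the quantity discussed in the abstract (a standard textbook definition, not text from this page), the mutual information between two continuous random variables X and Y with joint density p(x, y) and marginals p(x) and p(y) can be written as

    I(X;Y) = \iint p(x,y)\, \log \frac{p(x,y)}{p(x)\, p(y)}\, dx\, dy

Because mutual information is invariant under invertible transformations of either variable (a consequence of the data processing inequality), it behaves in the reparameterization-independent way that the equitability criterion described above is meant to capture.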
319 Citations
  • Theoretical Foundations of Equitability and the Maximal Information Coefficient (8 citations)
  • Copula Correlation: An Equitable Dependence Measure and Extension of Pearson's Correlation (25 citations, highly influenced)
  • Jackknife approach to the estimation of mutual information (13 citations, highly influenced)
  • Measuring Dependence Powerfully and Equitably (39 citations)
  • Equitability, interval estimation, and statistical power (13 citations, highly influenced)
  • Part mutual information for quantifying direct associations in networks (104 citations)
  • An empirical study of the maximal and total information coefficients and leading measures of dependence (15 citations, highly influenced)
  • Cleaning up the record on the maximal information coefficient and equitability (28 citations)
  • Estimating the Mutual Information between Two Discrete, Asymmetric Variables with Limited Samples (3 citations)

References

Showing 1-10 of 63 references
  • Equitability Analysis of the Maximal Information Coefficient, with Comparisons (62 citations)
  • Comment on “Detecting Novel Associations in Large Data Sets” (91 citations)
  • Statistical validation of mutual information calculations: comparison of alternative numerical algorithms (169 citations)
  • Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data (194 citations)
  • Estimation of Entropy and Mutual Information, L. Paninski, Neural Computation, 2003 (1,127 citations)
  • The mutual information: Detecting and evaluating dependencies between variables (655 citations)
  • A Correlation for the 21st Century, T. Speed, Science, 2011 (110 citations)
  • Estimating mutual information using B-spline functions – an improved similarity measure for analysing gene expression data (261 citations)
  • Estimating mutual information (2,210 citations)
  • Parametric Inference in the Large Data Limit Using Maximally Informative Models (16 citations)