Corpus ID: 1787057

Estimating Mutual Information by Local Gaussian Approximation

@inproceedings{Gao2015EstimatingMI,
  title={Estimating Mutual Information by Local Gaussian Approximation},
  author={Shuyang Gao and Greg Ver Steeg and Aram Galstyan},
  booktitle={UAI},
  year={2015}
}
  • Computer Science, Mathematics, Physics
  • Abstract: Estimating mutual information (MI) from samples is a fundamental problem in statistics, machine learning, and data analysis. Recently it was shown that a popular class of non-parametric MI estimators performs very poorly for strongly dependent variables and has sample complexity that scales exponentially with the true MI. This undesired behavior was attributed to the reliance of those estimators on the local uniformity of the underlying (and unknown) probability density function. Here we present a…
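The "popular class of non-parametric MI estimators" the abstract critiques is the k-nearest-neighbour family, most prominently the Kraskov–Stögbauer–Grassberger (KSG) estimator ("Estimating mutual information", listed under References below). The sketch below is a minimal NumPy implementation of KSG algorithm 1, shown only to illustrate the baseline this paper's local Gaussian approach improves upon; it is not the paper's own estimator, and the hand-rolled digamma series is just a way to avoid a SciPy dependency.

```python
import numpy as np

def _digamma(x):
    # Digamma via recurrence plus asymptotic series (avoids scipy.special).
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + np.log(x) - 0.5 / x - f * (1/12.0 - f * (1/120.0 - f / 252.0))

def ksg_mi(x, y, k=3):
    """KSG estimator (algorithm 1) of I(X;Y) from paired samples.

    Uses max-norm distances in the joint space; for each point, counts
    marginal neighbours strictly inside the k-th joint neighbour distance.
    """
    n = len(x)
    x = np.asarray(x, dtype=float).reshape(n, -1)
    y = np.asarray(y, dtype=float).reshape(n, -1)
    mi = _digamma(k) + _digamma(n)
    for i in range(n):
        dx = np.max(np.abs(x - x[i]), axis=1)
        dy = np.max(np.abs(y - y[i]), axis=1)
        d = np.maximum(dx, dy)
        eps = np.sort(d)[k]                    # k-th neighbour (index 0 is self)
        nx = np.count_nonzero(dx < eps) - 1    # exclude the point itself
        ny = np.count_nonzero(dy < eps) - 1
        mi -= (_digamma(nx + 1) + _digamma(ny + 1)) / n
    return mi
```

For bivariate Gaussians with correlation ρ the true MI has the closed form −½ log(1 − ρ²), which makes the exponential-sample-complexity behaviour the abstract describes easy to probe: as ρ → 1 the estimate degrades sharply unless n grows very fast.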


    Citations

    Publications citing this paper.
    Showing a selection of the 20 citing publications.

    Breaking the Bandwidth Barrier: Geometrical Adaptive Entropy Estimation


    Analysis of KNN Information Estimators for Smooth Distributions

    • Puning Zhao, Lifeng Lai
    • Mathematics, Computer Science
    • 2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
    • 2018

    Minimax Estimation of Quadratic Fourier Functionals


    Nonparametric Direct Entropy Difference Estimation


    Bridging Mixture Model Estimation and Information Bounds Using I-MMSE


    References

    Publications referenced by this paper.
    Showing a selection of the 25 references.

    A non-parametric k-nearest neighbour entropy estimator


    Estimating mutual information.


    Ensemble estimation of multivariate f-divergence


    Equitability, mutual information, and the maximal information coefficient.