Corpus ID: 195767524

Estimating Information-Theoretic Quantities with Random Forests

@article{Guo2019EstimatingIQ,
  title={Estimating Information-Theoretic Quantities with Random Forests},
  author={Richard Guo and Cencheng Shen and Joshua T. Vogelstein},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.00325}
}
  • Mathematics, Computer Science
  • Information-theoretic quantities, such as mutual information and conditional entropy, are useful statistics for measuring the dependence between two random variables. However, estimating these quantities in a non-parametric fashion is difficult, especially when the variables are high-dimensional, a mixture of continuous and discrete values, or both. In this paper, we propose a decision forest method, Conditional Forests (CF), to estimate these quantities. By combining quantile regression…
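The paper's CF estimator (forest-based, handling mixed continuous/discrete data) is not reproduced here. As a minimal illustration of the quantity being estimated, the sketch below computes a plug-in estimate of mutual information for paired discrete samples; the function name and toy data are illustrative, not from the paper:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in bits from paired discrete samples.

    I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ),
    with probabilities replaced by empirical frequencies.
    """
    n = len(xs)
    px = Counter(xs)                # marginal counts of X
    py = Counter(ys)                # marginal counts of Y
    pxy = Counter(zip(xs, ys))      # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p(x) * p(y)) == c * n / (px[x] * py[y])
        mi += p_joint * log2(c * n / (px[x] * py[y]))
    return mi

# Perfectly dependent fair coins: I(X; Y) = H(X) = 1 bit.
xs = [0, 1, 0, 1, 0, 1, 0, 1]
print(mutual_information(xs, xs))  # 1.0
```

Note the plug-in estimator is known to be biased for small samples and breaks down for continuous variables, which is exactly the setting the paper's forest-based approach targets.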

    Citations

    Publications citing this paper.

    A general approach to progressive learning


    References

    Publications referenced by this paper.
    SHOWING 1-10 OF 32 REFERENCES

    Estimating Mutual Information for Discrete-Continuous Mixtures


    Estimating mutual information.


    Ensemble Estimators for Multivariate Entropy Estimation

    Quantile Regression Forests


    Random Forests

    • Leo Breiman
    • Computer Science, Mathematics
    • Machine Learning
    • 2001