Corpus ID: 218684877

Information Thresholds for Non-Parametric Structure Learning on Tree Graphical Models.

@article{Nikolakakis2020InformationThresholds,
  title={Information Thresholds for Non-Parametric Structure Learning on Tree Graphical Models},
  author={Konstantinos E. Nikolakakis and Dionysios S. Kalogerias and A. Sarwate},
  journal={arXiv: Machine Learning},
  year={2020}
}
We provide high probability finite sample complexity guarantees for non-parametric structure learning of tree-shaped graphical models whose nodes are discrete random variables with either finite or countable alphabets, both in the noiseless and noisy regimes. We study a fundamental quantity called the (noisy) information threshold, which arises naturally from the error analysis of the Chow-Liu algorithm and, as we discuss, provides explicit necessary and sufficient conditions on sample…
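The abstract's error analysis is built around the Chow-Liu algorithm: estimate the pairwise mutual information between every pair of variables from samples, then output a maximum-weight spanning tree. The sketch below is an illustrative plug-in implementation under assumed toy data (a three-variable binary Markov chain), not the paper's method; all names are hypothetical.

```python
# Minimal sketch of the Chow-Liu procedure: plug-in mutual information
# estimates plus a maximum-weight spanning tree (Kruskal with union-find).
# The toy data and all identifiers below are illustrative assumptions.
import math
import random
from collections import Counter
from itertools import combinations

def empirical_mi(samples, i, j):
    """Plug-in estimate of I(X_i; X_j) in nats from a list of sample tuples."""
    n = len(samples)
    joint = Counter((s[i], s[j]) for s in samples)
    pi = Counter(s[i] for s in samples)
    pj = Counter(s[j] for s in samples)
    mi = 0.0
    for (a, b), c in joint.items():
        # p(a,b) * log( p(a,b) / (p(a) p(b)) ), with counts c, pi[a], pj[b]
        mi += (c / n) * math.log(c * n / (pi[a] * pj[b]))
    return mi

def chow_liu_tree(samples, num_vars):
    """Maximum-weight spanning tree over pairwise empirical mutual information."""
    edges = sorted(
        ((empirical_mi(samples, i, j), i, j)
         for i, j in combinations(range(num_vars), 2)),
        reverse=True,
    )
    parent = list(range(num_vars))  # union-find forest
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # greedily keep the edge unless it closes a cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy Markov chain X0 -> X1 -> X2 with 10% flip noise per step;
# with enough samples the recovered tree should be the chain 0-1-2.
random.seed(0)
data = []
for _ in range(2000):
    x0 = random.randint(0, 1)
    x1 = x0 if random.random() < 0.9 else 1 - x0
    x2 = x1 if random.random() < 0.9 else 1 - x1
    data.append((x0, x1, x2))
tree = chow_liu_tree(data, 3)
```

Because the true edges carry strictly larger mutual information than the shortcut pair (0, 2), the greedy spanning tree recovers the chain once the empirical estimates concentrate — the sample-size condition for that concentration is exactly what the paper's information threshold quantifies.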

References
• Learning High-Dimensional Markov Forest Distributions: Analysis of Error Rates
• Learning Tree Structures from Noisy Data
• A Large-Deviation Analysis of the Maximum-Likelihood Learning of Markov Tree Structures
• Convergence properties of functional estimates for discrete distributions
• C. Chow and C. Liu, "Approximating discrete probability distributions with dependence trees," IEEE Trans. Inf. Theory, 1968
• Efficient Learning of Discrete Graphical Models
• Efficiently Learning Ising Models on Arbitrary Graphs
• Forest Density Estimation
• Hardness of parameter estimation in graphical models