Corpus ID: 5483189

Shannon Information and Kolmogorov Complexity

@article{Grnwald2004ShannonIA,
  title={Shannon Information and Kolmogorov Complexity},
  author={P. Gr{\"u}nwald and P. Vit{\'a}nyi},
  journal={ArXiv},
  year={2004},
  volume={cs.IT/0410002}
}
Abstract: We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov (`algorithmic') mutual information, and probabilistic sufficient statistic versus algorithmic sufficient statistic (related to lossy…
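As background for the contrast the abstract draws, a minimal sketch (not from the paper itself, and with illustrative function names of my own choosing): Shannon entropy of a known distribution is directly computable from its probabilities, whereas Kolmogorov complexity of an individual string is uncomputable and in practice is only upper-bounded, e.g. by the output length of a general-purpose compressor such as zlib.

```python
import math
import zlib

def shannon_entropy(probs):
    """Shannon entropy H(P) = -sum_i p_i * log2(p_i) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def compression_upper_bound(s: bytes) -> int:
    """Crude upper bound on the Kolmogorov complexity K(s), in bits, via the
    length of a zlib-compressed encoding. K itself is uncomputable; any
    compressor only witnesses an upper bound."""
    return 8 * len(zlib.compress(s, level=9))

# A fair coin has exactly one bit of entropy per outcome.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A highly regular string compresses far below its literal length in bits,
# witnessing that its Kolmogorov complexity is small.
regular = b"ab" * 1000
print(compression_upper_bound(regular) < 8 * len(regular))  # True
```

The asymmetry shown here (exact computation versus one-sided bound) is one of the fundamental differences between the two theories that the paper examines.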

