Entropy estimation of symbol sequences

  • T. Schürmann, P. Grassberger
  • Published 1996 in Chaos, vol. 6, no. 3
  • Mathematics, Physics, Computer Science, Medicine
  • We discuss algorithms for estimating the Shannon entropy h of finite symbol sequences with long-range correlations. In particular, we consider algorithms which estimate h from the code lengths produced by some compression algorithm. Our interest is in describing their convergence with sequence length, assuming no limits for the space and time complexities of the compression algorithms. A scaling law is proposed for extrapolation from finite sample lengths. This is applied to sequences of…
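The abstract's central idea, estimating the entropy rate h from the code lengths produced by a compressor, can be illustrated with a minimal sketch. This uses Python's `zlib` as a stand-in compressor (a choice made here for illustration, not the paper's method, and without the paper's finite-length extrapolation):

```python
import os
import zlib

def compression_entropy_estimate(seq: bytes, level: int = 9) -> float:
    """Crude estimate of the entropy rate h in bits per symbol:
    compressed length (in bits) divided by sequence length.
    A toy upper-bound illustration, not the paper's scaling-law scheme."""
    compressed = zlib.compress(seq, level)
    return 8 * len(compressed) / len(seq)

# A highly repetitive sequence compresses well, so its estimate is low;
# incompressible random bytes stay near 8 bits/symbol (plus overhead).
low = compression_entropy_estimate(b"ab" * 50_000)
high = compression_entropy_estimate(os.urandom(100_000))
```

For finite sequences such an estimate is biased upward by compressor overhead, which is exactly why the paper studies convergence with sequence length rather than taking a single finite-length code length at face value.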
249 Citations

    Selected citing papers:
    • Scaling behaviour of entropy estimates (6 citations)
    • A Note on the Shannon Entropy of Short Sequences
    • Entropy Gradient: A Technique for Estimating the Entropy of Finite Time Series
    • Effective normalization of complexity measurements for epoch length and sampling frequency (34 citations)
    • Computing entropy rate of symbol sources & a distribution-free limit theorem, I. Chattopadhyay and Hod Lipson, 2014 48th Annual Conference on Information Sciences and Systems (CISS), 2014 (4 citations)
    • Determining the Number of Samples Required to Estimate Entropy in Natural Sequences (2 citations)
    • Regularities unseen, randomness observed: levels of entropy convergence (322 citations)

