Universal estimation of entropy and divergence via block sorting

@inproceedings{Cai2002UniversalEO,
  title={Universal estimation of entropy and divergence via block sorting},
  author={Haixiao Cai and S. Kulkarni and Sergio Verd{\'u}},
  booktitle={Proceedings IEEE International Symposium on Information Theory},
  year={2002},
  pages={433-}
}
In this paper, we present a new algorithm that estimates both the entropy and the divergence of two finite-alphabet, finite-memory tree sources, using only a single realization from each of the two sources. Our algorithm outperforms a previous LZ-based method. It is motivated by data compression based on the Burrows-Wheeler block sorting transform, using the fact that if the input is a finite-memory tree source, then the divergence between the output distribution and a piecewise…
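The block-sorting idea behind the estimator can be illustrated with a minimal sketch. Assumptions to note: the paper's estimator uses a data-dependent segmentation of the BWT output, while this sketch uses fixed uniform segments, and the function names (`bwt`, `bwt_entropy_estimate`) are hypothetical, not from the paper.

```python
import math
from collections import Counter

def bwt(s: str) -> str:
    """Naive Burrows-Wheeler transform with a '$' sentinel (O(n^2 log n);
    real implementations use suffix arrays)."""
    s = s + "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def empirical_entropy(block: str) -> float:
    """Zeroth-order empirical entropy of a block, in bits per symbol."""
    counts = Counter(block)
    n = len(block)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def bwt_entropy_estimate(s: str, num_segments: int = 8) -> float:
    """Hypothetical simplification of the BWT-based estimator: because the
    BWT groups symbols by context, the output of a finite-memory tree
    source is close to piecewise i.i.d., so a length-weighted average of
    per-segment empirical entropies approximates the source entropy."""
    out = bwt(s).replace("$", "")  # drop the sentinel
    n = len(out)
    seg_len = max(1, n // num_segments)
    segments = [out[i:i + seg_len] for i in range(0, n, seg_len)]
    return sum(len(seg) * empirical_entropy(seg) for seg in segments) / n
```

For example, a constant string yields an estimate of 0 bits/symbol, while a memoryless uniform binary source should give an estimate near 1 bit/symbol; the divergence estimator in the paper follows the same segmentation idea applied to two realizations.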

Citations

Publications citing this paper.

Algorithms for estimating information distance with application to bioinformatics and linguistics

  • Canadian Conference on Electrical and Computer Engineering 2004 (IEEE Cat. No.04CH37513)
  • 2004

Universal Divergence Estimation for Finite-Alphabet Sources

  • IEEE Transactions on Information Theory
  • 2006

Divergence estimation of continuous distributions based on data-dependent partitions

  • IEEE Transactions on Information Theory
  • 2005

Universal estimation of divergence for continuous distributions via data-dependent partitions

  • Proceedings. International Symposium on Information Theory, 2005. ISIT 2005.
  • 2005

Universal estimation of information measures

  • IEEE Information Theory Workshop, 2005.
  • 2005

References

Publications referenced by this paper.

Universal lossless source coding with the Burrows Wheeler transform

  • Proceedings DCC'99 Data Compression Conference (Cat. No. PR00096)
  • 1999

A measure of relative entropy between individual sequences with application to universal classification

N. Merhav, J. Ziv
  • IEEE Trans. Inform. Theory