Existence of optimal prefix codes for infinite source alphabets

@article{Linder1997ExistenceOO,
  title={Existence of optimal prefix codes for infinite source alphabets},
  author={Tam{\'a}s Linder and Vahid Tarokh and Kenneth Zeger},
  journal={IEEE Trans. Inf. Theory},
  year={1997},
  volume={43},
  pages={2026--2028}
}
It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal prefix code which can be constructed from Huffman codes for truncated versions of the random variable, and that the average lengths of any sequence of Huffman codes for the truncated versions converge to that of the optimal code. Also, it is shown that every optimal infinite code achieves Kraft's inequality with equality. 
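The truncation idea can be sketched concretely (a hypothetical illustration for a geometric source, not the paper's construction; `huffman_lengths` is an assumed helper name): build Huffman codes for truncated versions of the distribution, with the tail mass lumped into one extra symbol, and watch the average length settle while the Kraft sum stays at 1.

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap items: (probability, tiebreak, indices of leaves in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = count(len(probs))
    while len(heap) > 1:
        p1, _, leaves1 = heapq.heappop(heap)
        p2, _, leaves2 = heapq.heappop(heap)
        for leaf in leaves1 + leaves2:
            lengths[leaf] += 1          # merged leaves sink one level deeper
        heapq.heappush(heap, (p1 + p2, next(tiebreak), leaves1 + leaves2))
    return lengths

q = 0.5                                  # geometric source: p(i) = (1 - q) * q**i
for n in (4, 8, 16, 32):
    probs = [(1 - q) * q**i for i in range(n)]
    probs.append(q**n)                   # lump the tail mass into one extra symbol
    lengths = huffman_lengths(probs)
    avg = sum(p * l for p, l in zip(probs, lengths))
    kraft = sum(2.0**-l for l in lengths)
    print(n, round(avg, 4), kraft)       # Kraft sum of a complete code is 1.0
```

For q = 0.5 the average lengths approach 2 bits, the entropy of this source, and every truncated Huffman code meets Kraft's inequality with equality, consistent with the result above.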

On the Optimal Coding

It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal code constructed from optimal codes for truncated versions of the random variable.

Algorithms for Infinite Huffman-Codes

The approach is to define an infinite weighted graph with the property that the least-cost infinite path in the graph corresponds to the optimal code, and to show that, even though the graph is infinite, the least-cost infinite path has a repetitive structure, making it possible not only to find this path but to find it relatively efficiently.

Redundancy and optimality of codes for infinite-entropy sources

  • M. Klimesh
  • Computer Science
    2008 IEEE International Symposium on Information Theory
  • 2008
Three redundancy definitions and two optimality definitions are considered that turn out to be equivalent, and they are also equivalent to the 'expected codeword length minus entropy' definition when the source entropy is finite.

Existence, Uniqueness, and Optimality of Sibling-Property Codes for Infinite Sources

A Huffman-Gallager code is defined as any code that has the sibling property, and some basic facts about such codes are presented.

Optimal Prefix Codes And Huffman Codes

It is shown that, from the viewpoint of computational difficulty, the problem of breaking an optimal prefix code is NP-complete.

Optimal maximal encoding different from Huffman encoding

  • Dongyang Long, W. Jia
  • Computer Science
    Proceedings International Conference on Information Technology: Coding and Computing
  • 2001
It is proved that for every random variable with a countably infinite set of outcomes and with finite entropy there exists an optimal maximal code which can be constructed from optimal maximal codes for truncated versions of the random variable.

The construction of codes for infinite sets

  • M. Laidlaw
  • Computer Science
    South Afr. Comput. J.
  • 2005
A wider class of codes is presented here, all of which are suitable for encoding the elements from an infinite set, and it is shown that each code can be described by a polynomial K(L), which determines the number of codewords of length L.
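The idea of characterizing a code by how many codewords it has of each length can be illustrated with the classic Elias gamma code (an example chosen here for familiarity, not one of the paper's codes): every positive integer gets a codeword of odd length L = 2k + 1, and there are exactly 2**k codewords of that length.

```python
from collections import Counter

def elias_gamma(n):
    """Elias gamma codeword for a positive integer n:
    floor(log2 n) zero bits followed by n in binary."""
    b = format(n, "b")
    return "0" * (len(b) - 1) + b

# Count codewords by length: gamma codewords have odd length 2k + 1,
# and there are exactly 2**k of them for each k.
counts = Counter(len(elias_gamma(n)) for n in range(1, 1024))
```

Tabulating `counts` recovers the length profile of the code directly, the same kind of description the paper captures with its function K(L).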

Prefix Codes for Power Laws with Countable Support

This work introduces a family of prefix codes with an eye towards near-optimal coding of known distributions, and one application of the near-optimal codes is an improved representation of rational numbers.

Optimal Prefix Codes for Infinite Alphabets With Nonlinear Costs

  • M. Baer
  • Computer Science
    IEEE Transactions on Information Theory
  • 2008
Methods for finding codes optimal for beta-exponential means are introduced; one method applies to geometric distributions, while another applies to distributions with lighter tails. Both methods are extended to alphabetic codes.

Algorithms for Infinite Huffman-Codes (Extended Abstract)

An infinite weighted graph is defined with the property that the least-cost infinite path in the graph corresponds to the optimal code, and it is shown that, even though the graph is infinite, the least-cost infinite path has a repetitive structure, making it possible not only to find this path but to find it relatively efficiently.

References

SHOWING 1-10 OF 14 REFERENCES

On the redundancy of optimal binary prefix-condition codes for finite and infinite sources

A new lower bound is obtained for the redundancy of optimal binary prefix-condition (OBPC) codes for a memoryless source for which the probability of the most likely source letter is known; this bound, and upper bounds obtained by Gallager and Johnsen, hold for infinite as well as finite source alphabets.

Optimal source codes for geometrically distributed integer alphabets (Corresp.)

An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is
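The codes shown optimal for geometric distributions here are the Golomb codes: a unary quotient followed by a truncated-binary remainder. A minimal encoder sketch (the function name is illustrative):

```python
def golomb_encode(n, m):
    """Golomb codeword for a nonnegative integer n with parameter m:
    unary quotient, then the remainder in truncated binary."""
    q, r = divmod(n, m)
    bits = "1" * q + "0"                 # unary part: q ones, then a terminating zero
    if m == 1:
        return bits                      # no remainder bits are needed
    k = (m - 1).bit_length()             # k = ceil(log2(m))
    cutoff = (1 << k) - m                # remainders below cutoff use k - 1 bits
    if r < cutoff:
        bits += format(r, "0{}b".format(k - 1))
    else:
        bits += format(r + cutoff, "0{}b".format(k))
    return bits
```

For a geometric distribution with parameter q, the optimal Golomb parameter is the integer m satisfying q**m + q**(m + 1) <= 1 < q**m + q**(m - 1), which is the estimate the Huffman-based argument verifies.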

Huffman coding with an infinite alphabet

Two new concepts of the optimality of a prefix D-ary code are introduced, which are shown to be equivalent to that defined in the traditional way for the case where the Shannon entropy H(P) diverges.

Huffman-type codes for infinite source distributions

  • J. Abrahams
  • Computer Science
    Proceedings of IEEE Data Compression Conference (DCC'94)
  • 1994
A new sufficient condition is given for an infinite source distribution to share a minimum average codeword length code with a geometric distribution. Thus some new examples of parametric families of

Optimal source coding for a class of integer alphabets (Corresp.)

  • P. Humblet
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1978
The Huffman optimum encoding technique is extended to a class of p(i) including those whose tail decreases.

A new bound for the data expansion of Huffman codes

It is proved that the maximum data expansion δ of Huffman codes is upper-bounded by δ < 1.39, which improves on the previous best known upper bound δ < 2.39.

A method for the construction of minimum-redundancy codes

  • D. Huffman
  • Computer Science, Business
    Proceedings of the IRE
  • 1952
A minimum-redundancy code is one constructed in such a way that the average number of coding digits per message is minimized.
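Huffman's construction builds such a code by repeatedly merging the two lowest-weight subtrees; a compact sketch (the example weights are illustrative, not from the paper):

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Binary minimum-redundancy (Huffman) code for a symbol -> weight
    mapping; returns a symbol -> codeword-string dictionary."""
    tiebreak = count()
    heap = [(w, next(tiebreak), {sym: ""}) for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, code1 = heapq.heappop(heap)   # two lowest-weight subtrees
        w2, _, code2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
# higher-weight symbols receive shorter codewords, and no codeword
# is a prefix of another
```

The resulting code minimizes the weighted average codeword length over all prefix codes for the given weights.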

Elements of Information Theory

The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes.

Principles of mathematical analysis

Chapter 1: The Real and Complex Number Systems Introduction Ordered Sets Fields The Real Field The Extended Real Number System The Complex Field Euclidean Spaces Appendix Exercises Chapter 2: Basic

Prisco is with MIT Laboratory for Computer Science
