# Existence of optimal prefix codes for infinite source alphabets

@article{Linder1997ExistenceOO, title={Existence of optimal prefix codes for infinite source alphabets}, author={Tam{\'a}s Linder and Vahid Tarokh and Kenneth Zeger}, journal={IEEE Trans. Inf. Theory}, year={1997}, volume={43}, pages={2026-2028} }

It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal prefix code which can be constructed from Huffman codes for truncated versions of the random variable, and that the average lengths of any sequence of Huffman codes for the truncated versions converge to that of the optimal code. Also, it is shown that every optimal infinite code achieves Kraft's inequality with equality.
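The truncation construction in the abstract can be illustrated numerically. The sketch below (illustrative code, not from the paper) computes the average length of a binary Huffman code via the standard internal-node-probability identity, then applies it to truncated, renormalized versions of the geometric distribution P(i) = 2^-(i+1); the average lengths approach the optimal value of 2 bits as the truncation point grows, in line with the convergence result stated above.

```python
import heapq

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for `probs`.

    Uses the fact that the average length equals the sum of the
    probabilities of all internal (merged) nodes of the Huffman tree.
    """
    heap = [(p, i) for i, p in enumerate(probs)]  # index breaks ties
    heapq.heapify(heap)
    avg = 0.0
    nxt = len(probs)
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        avg += p1 + p2  # each merge adds one bit to every symbol beneath it
        heapq.heappush(heap, (p1 + p2, nxt))
        nxt += 1
    return avg

# Truncate P(i) = 2^-(i+1) to its first n outcomes and renormalize;
# the Huffman average lengths approach the optimal 2 bits as n grows.
for n in (4, 8, 16, 24):
    tail = [2.0 ** -(i + 1) for i in range(n)]
    z = sum(tail)
    print(n, huffman_avg_length([p / z for p in tail]))
```

For the dyadic probabilities used here the Huffman lengths are exactly the ideal lengths, so the gap to 2 bits comes only from the truncation itself.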

## 50 Citations

### On the Optimal Coding

- Computer Science
- IEEE Pacific Rim Conference on Multimedia
- 2001

It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal code constructed from optimal codes for truncated versions of the random variable.

### Algorithms for Infinite Huffman-Codes

- Computer Science
- SODA '04
- 2004

The approach is to define an infinite weighted graph with the property that the least-cost infinite path in the graph corresponds to the optimal code, and to show that even though the graph is infinite, the least-cost infinite path has a repetitive structure, so that it is possible not only to find this path but to find it relatively efficiently.

### Redundancy and optimality of codes for infinite-entropy sources

- Computer Science
- 2008 IEEE International Symposium on Information Theory
- 2008

Three redundancy definitions and two optimality definitions are considered that turn out to be equivalent, and they are also equivalent to the 'expected codeword length minus entropy' definition when the source entropy is finite.

### Existence, Uniqueness, and Optimality of Sibling-Property Codes for Infinite Sources

- Computer Science
- 2006 IEEE International Symposium on Information Theory
- 2006

A Huffman-Gallager code is defined as any code that has the sibling property, and some basic facts about such codes are presented.

### Optimal Prefix Codes And Huffman Codes

- Computer Science
- Int. J. Comput. Math.
- 2003

It is shown that, from the viewpoint of computational difficulty, the problem of breaking an optimal prefix code is NP-complete.

### Optimal maximal encoding different from Huffman encoding

- Computer Science
- Proceedings International Conference on Information Technology: Coding and Computing
- 2001

It is proved that for every random variable with a countably infinite set of outcomes and with finite entropy there exists an optimal maximal code which can be constructed from optimal maximal codes for truncated versions of the random variable.

### The construction of codes for infinite sets

- Computer Science
- South Afr. Comput. J.
- 2005

A wider class of codes is presented here, all of which are suitable for encoding the elements from an infinite set, and it is shown that each code can be described by a polynomial K(L), which determines the number of codewords of length L.

### Prefix Codes for Power Laws with Countable Support

- Computer Science
- ArXiv
- 2006

This work introduces a family of prefix codes with an eye towards near-optimal coding of known distributions, and one application of the near-optimal codes is an improved representation of rational numbers.

### Algorithms for Infinite Huffman-Codes (Extended Abstract)

- Computer Science
- 2003

An infinite weighted graph is defined with the property that the least-cost infinite path in the graph corresponds to the optimal code, and it is shown that even though the graph is infinite, the least-cost infinite path has a repetitive structure, so that it is possible not only to find this path but to find it relatively efficiently.

### Prefix codes for power laws

- Computer Science
- 2008 IEEE International Symposium on Information Theory
- 2008

A family of prefix codes with an eye towards near-optimal coding of known distributions is introduced, precisely estimating compression performance for well-known probability distributions using these new codes and using previously known prefix codes.

## References

SHOWING 1-10 OF 14 REFERENCES

### On the redundancy of optimal binary prefix-condition codes for finite and infinite sources

- Computer Science
- IEEE Trans. Inf. Theory
- 1987

A new lower bound is obtained for the redundancy of optimal binary prefix-condition (OBPC) codes for a memoryless source for which the probability of the most likely source letter is known, and this bound, together with upper bounds obtained by Gallager and Johnsen, holds for infinite as well as finite source alphabets.

### Optimal source codes for geometrically distributed integer alphabets (Corresp.)

- Computer Science
- IEEE Trans. Inf. Theory
- 1975

An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is…

### Huffman coding with an infinite alphabet

- Computer Science
- IEEE Trans. Inf. Theory
- 1996

Two new concepts of the optimality of a prefix D-ary code are introduced, which are shown to be equivalent to that defined in the traditional way for the case where the Shannon entropy H(P) diverges.

### Huffman-type codes for infinite source distributions

- Computer Science
- Proceedings of IEEE Data Compression Conference (DCC'94)
- 1994

A new sufficient condition is given for an infinite source distribution to share a minimum average codeword length code with a geometric distribution. Thus some new examples of parametric families of…

### Optimal source coding for a class of integer alphabets (Corresp.)

- Computer Science
- IEEE Trans. Inf. Theory
- 1978

The Huffman optimum encoding technique is extended to a class of distributions p(i), including those whose tail decreases.

### A new bound for the data expansion of Huffman codes

- Computer Science
- IEEE Trans. Inf. Theory
- 1997

It is proved that the maximum data expansion of Huffman codes is upper-bounded by δ < 1.39, which improves on the previous best known upper bound Δ < 2.39.

### A method for the construction of minimum-redundancy codes

- Computer Science, Business
- Proceedings of the IRE
- 1952

A minimum-redundancy code is one constructed in such a way that the average number of coding digits per message is minimized.
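Huffman's procedure from this 1952 paper builds a minimum-redundancy code by repeatedly merging the two least probable subtrees. A minimal sketch of the binary case (illustrative code, assuming string symbols so that tuples can serve as internal nodes):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Binary Huffman code: dict mapping each symbol to its codeword string."""
    tiebreak = count()  # breaks probability ties so tuples never compare symbols
    heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)  # the two least probable subtrees...
        p2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (a, b)))  # ...are merged
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):  # internal node: descend, appending 0 / 1
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                        # leaf: record the accumulated codeword
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# Codeword lengths 1, 2, 3, 3 — and the Kraft sum equals exactly 1.
```

For these dyadic probabilities the average length equals the entropy, and the code meets Kraft's inequality with equality, as every finite Huffman code does.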

### Elements of Information Theory

- Computer Science
- 1991

The authors examine the role of entropy, inequality, and randomness in the design and construction of codes.

### Principles of mathematical analysis

- Mathematics
- 1964

Chapter 1: The Real and Complex Number Systems Introduction Ordered Sets Fields The Real Field The Extended Real Number System The Complex Field Euclidean Spaces Appendix Exercises Chapter 2: Basic…
