Corpus ID: 11392028

Entropy and Huffman Codes

Published 2012
We will show that:
• the entropy of a random variable gives a lower bound on the average number of bits per character required by any binary code;
• Huffman codes are optimal among binary codes in the average number of bits used per character;
• the average number of bits per character used by a Huffman code is within one bit of the entropy of the underlying random variable;
• one can get arbitrarily close to the entropy, in bits per character, by encoding blocks of several characters at a time.
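The relationship between entropy and the Huffman average code length can be checked empirically. The following sketch (not from the paper; a standard heap-based construction, with the helper name `huffman_code_lengths` chosen here for illustration) builds Huffman code lengths for a sample string and verifies the bound H ≤ L < H + 1 discussed above:

```python
import heapq
from collections import Counter
from math import log2

def huffman_code_lengths(freqs):
    """Build a Huffman tree over symbol frequencies and return
    the code length in bits assigned to each symbol."""
    # Heap entries: (weight, unique tiebreaker, {symbol: depth}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees; every symbol inside
        # them moves one level deeper in the tree.
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
n = len(text)
lengths = huffman_code_lengths(freqs)

# Empirical entropy of the character distribution.
entropy = -sum((c / n) * log2(c / n) for c in freqs.values())
# Average bits per character under the Huffman code.
avg_bits = sum(freqs[s] * lengths[s] for s in freqs) / n
# The entropy lower-bounds the Huffman average: H <= L < H + 1.
assert entropy <= avg_bits < entropy + 1
```

For `"abracadabra"` the entropy is about 2.04 bits per character, and the Huffman code averages 23/11 ≈ 2.09 bits, illustrating how close the optimal binary code comes to the entropy bound.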