On empirical cumulant generating functions of code lengths for individual sequences

@article{Merhav2017OnEC,
  title={On empirical cumulant generating functions of code lengths for individual sequences},
  author={Neri Merhav},
  journal={2017 IEEE International Symposium on Information Theory (ISIT)},
  year={2017},
  pages={1500-1504}
}
  • Published 4 May 2016
We consider the problem of lossless compression of individual sequences using finite-state (FS) machines, from the perspective of the best achievable empirical cumulant generating function (CGF) of the code length, i.e., the normalized logarithm of the empirical average of the exponentiated code length. Since the probabilistic CGF is minimized in terms of the Rényi entropy of the source, one of the motivations of this study is to derive an individual-sequence analogue of the Rényi entropy, in…
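As a concrete illustration of the central quantity in the abstract, here is a minimal numerical sketch (not code from the paper) of the empirical CGF of code lengths: the normalized logarithm of the empirical average of the exponentiated code length. The function name and the block-length normalization `n` are assumptions made for illustration only.

```python
import math

def empirical_cgf(lengths, s, n):
    """Empirical cumulant generating function of code lengths:
    (1/n) * log of the empirical average of exp(s * length).

    lengths: code lengths (in nats) assigned to K coded blocks
    s:       real argument of the CGF
    n:       block length used for normalization
    """
    avg = sum(math.exp(s * l) for l in lengths) / len(lengths)
    return math.log(avg) / n

# Sanity check: with identical code lengths L, the empirical CGF
# reduces to s * L / n for every s.
print(empirical_cgf([5.0, 5.0, 5.0], s=2.0, n=10))
```

By Jensen's inequality the empirical CGF always dominates `s` times the empirical average code length per symbol, which is why it is a strictly more demanding criterion than the first moment.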

Citations

Improved Bounds on Lossless Source Coding and Guessing Moments via Rényi Measures
TLDR
Upper and lower bounds on the optimal guessing moments of a random variable taking values on a finite set when side information may be available are provided, similar to Arikan’s bounds, but expressed in terms of the Arimoto-Rényi conditional entropy.
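Arikan-style guessing-moment bounds of the kind this citing paper refines are easy to check numerically. The sketch below is an illustration, not code from either paper: it computes the optimal ρ-th guessing moment (guessing candidates in decreasing order of probability) and compares it with Rényi-entropy bounds of order 1/(1+ρ). The helper names are my own.

```python
import math

def guessing_moment(p, rho):
    """Optimal rho-th guessing moment E[G^rho]: the best strategy
    guesses values in decreasing order of probability."""
    q = sorted(p, reverse=True)
    return sum(pi * (i + 1) ** rho for i, pi in enumerate(q))

def renyi_bound(p, rho):
    """(sum_x p(x)^(1/(1+rho)))^(1+rho), which equals
    exp(rho * H_alpha(p)) with alpha = 1/(1+rho)."""
    a = 1.0 / (1.0 + rho)
    return sum(pi ** a for pi in p) ** (1.0 + rho)

p = [0.5, 0.25, 0.125, 0.125]
upper = renyi_bound(p, 1.0)
lower = upper / (1.0 + math.log(len(p)))  # Arikan's (1+ln M)^(-rho) factor
print(lower, guessing_moment(p, 1.0), upper)
```

The optimal moment is sandwiched between the two Rényi expressions, which is the structure that the Arimoto-Rényi refinements in the cited work tighten.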
Fading Channel Coding Based on Entropy and Compressive Sensing
TLDR
Channel code length is investigated under Rayleigh and Rician fading assumptions with additive noise, and the inverse problem of identifying the corresponding distributions from the derived channel code lengths and the compressive-sensing-based number of samples is addressed.

References

Showing 1-10 of 13 references
Compression of individual sequences via variable-rate coding
TLDR
The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences.
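The finite-state compressibility in this reference is defined through incremental (LZ78) parsing. Below is a minimal sketch of the greedy parse, written as an illustration of the standard rule rather than as code from the paper; the function name is mine.

```python
def lz78_phrases(seq):
    """Incrementally parse seq into distinct phrases (LZ78 rule):
    extend the current phrase while it is still in the dictionary;
    on the first novel extension, emit it as a new phrase."""
    seen = set()
    phrases = []
    w = ""
    for a in seq:
        w += a
        if w not in seen:
            seen.add(w)
            phrases.append(w)
            w = ""
    if w:                      # leftover (possibly repeated) phrase
        phrases.append(w)
    return phrases

# c(x) * log c(x) / n, with c(x) the number of parsed phrases,
# is the usual upper estimate of the compression ratio.
print(lz78_phrases("aaaaaa"))   # ['a', 'aa', 'aaa']
```

Highly repetitive sequences produce few, long phrases (small c(x), hence low compressibility estimate), while irregular sequences fragment into many short phrases.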
Buffer overflow in variable length coding of fixed rate sources
  • F. Jelinek
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1968
In this paper, we develop and analyze an easily instrumentable scheme for variable length encoding of discrete memoryless fixed-rate sources in which buffer overflows result in codeword erasures at…
Universal coding with minimum probability of codeword length overflow
  • N. Merhav
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1991
TLDR
It is shown that the Lempel-Ziv algorithm (1978) asymptotically attains the optimal performance in the sense just defined, independently of the source and the value of B, and that faster convergence to the asymptotic optimum performance can be accomplished by using the minimum-description-length universal code for this subclass.
A Coding Theorem and Rényi's Entropy
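Campbell's result in this reference is what ties exponentially weighted code lengths to Rényi entropy; stated informally (my paraphrase, not a quotation):

```latex
% Campbell (1965), paraphrased: for t > 0 and a binary prefix code
% with lengths \ell(x), the exponentially weighted average length satisfies
L(t) \;=\; \frac{1}{t}\,\log_2 \mathbb{E}\!\left[2^{\,t\,\ell(X)}\right]
\;\ge\; H_{\alpha}(X), \qquad \alpha = \frac{1}{1+t},
% where the R\'enyi entropy of order \alpha is
H_{\alpha}(X) \;=\; \frac{1}{1-\alpha}\,\log_2 \sum_{x} p(x)^{\alpha},
% and the bound is achievable to within one bit,
% hence asymptotically tight per symbol.
```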
The optimal overflow and underflow probabilities with variable-length coding for the general source
  • O. Uchida, T. Han
  • Computer Science
    2000 IEEE International Symposium on Information Theory (Cat. No.00CH37060)
  • 2000
TLDR
This study shows that the infimum achievable threshold given the overflow probability exponent r always coincides with the infimum achievable fixed-length coding rate given the error exponent r, without any assumptions on the source.
An inequality on guessing and its application to sequential decoding
  • E. Arıkan
  • Computer Science
    Proceedings of 1995 IEEE International Symposium on Information Theory
  • 1995
TLDR
A simple derivation of the cutoff rate bound for single-user channels is obtained, and the previously unknown cutoff rate region of multi-access channels is determined.
Generalization of Huffman coding to minimize the probability of buffer overflow
  • P. Humblet
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1981
An algorithm is given to find a prefix condition code that minimizes the value of the moment generating function of the codeword length distribution for a given positive argument. This algorithm is…
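Humblet's procedure generalizes Huffman merging to exponential costs. The following is a hedged sketch of the well-known exponential-weight variant (minimizing E[c^L] for a fixed c ≥ 1 by merging the two smallest weights w1, w2 into c·(w1 + w2)); the names and data layout are my own, and c = 1 recovers ordinary Huffman coding.

```python
import heapq

def exponential_huffman_lengths(probs, c=1.0):
    """Codeword lengths of a binary prefix code built by exponential
    Huffman merging: repeatedly replace the two smallest weights
    w1, w2 by c * (w1 + w2).  For c = 1 this is ordinary Huffman
    coding; for c > 1 it targets the moment generating function
    E[c^L] of the codeword length L (Humblet-style criterion)."""
    lengths = [0] * len(probs)
    # heap items: (weight, tiebreak, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tiebreak = len(probs)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:      # every symbol in the merged subtree
            lengths[i] += 1    # moves one level deeper
        heapq.heappush(heap, (c * (w1 + w2), tiebreak, s1 + s2))
        tiebreak += 1
    return lengths
```

Raising c penalizes long codewords more heavily, so the resulting tree becomes progressively more balanced than the ordinary Huffman tree for the same distribution.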
Universal decoding for finite-state channels
  • J. Ziv
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1985
Universal decoding procedures for finite-state channels are discussed. Although the channel statistics are not known, universal decoding can achieve an error probability with an error exponent that,
Universal Prediction
TLDR
Both the probabilistic setting and the deterministic setting of the universal prediction problem are described with emphasis on the analogy and the differences between results in the two settings.
On optimum strategies for minimizing the exponential moments of a loss function
  • N. Merhav
  • Mathematics, Computer Science
    2012 IEEE International Symposium on Information Theory Proceedings
  • 2012
We consider a general problem of minimizing the exponential moment of a given loss function, with an emphasis on the relation to the more common criterion of minimizing the first moment of the same…
…