Rényi Divergence and Kullback-Leibler Divergence
The most important properties of Rényi divergence and Kullback-Leibler divergence are reviewed, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity.
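For reference, the standard definition of Rényi divergence of order α (discrete case shown, not the paper's own notation), together with its Kullback-Leibler limit:

\[
D_\alpha(P\|Q) \;=\; \frac{1}{\alpha-1}\,\ln \sum_i p_i^{\alpha} q_i^{1-\alpha},
\qquad
\lim_{\alpha \to 1} D_\alpha(P\|Q) \;=\; D(P\|Q) \;=\; \sum_i p_i \ln \frac{p_i}{q_i}.
\]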
Entropy and the law of small numbers
A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution.
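As a point of orientation, a bound of this flavour in the Bernoulli-sum special case (a standard result of this line of work, stated from memory rather than quoted from the paper): for S_n = X_1 + ... + X_n with the X_i independent Bernoulli(p_i) and λ = Σ_i p_i,

\[
D\bigl(P_{S_n}\,\big\|\,\mathrm{Po}(\lambda)\bigr) \;\le\; \frac{1}{\lambda}\sum_{i=1}^{n} \frac{p_i^{3}}{1-p_i}.
\]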
Refinements of Pinsker's inequality
The main result is an exact parametrization of L, leading to Taylor polynomials that are lower bounds for L, and thereby to extensions of the classical Pinsker (1960) inequality, which has numerous applications.
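For reference, the classical inequality being refined (standard statement in nats, with V the total variation distance; here L presumably denotes the tight lower bound on divergence as a function of V):

\[
D(P\|Q) \;\ge\; \tfrac{1}{2}\,V(P,Q)^{2},
\qquad
V(P,Q) \;=\; \sum_i \lvert p_i - q_i \rvert .
\]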
Maximum Entropy Fundamentals
The present paper offers a self-contained and comprehensive treatment of the fundamentals of both principles, especially regarding the study of continuity properties of the entropy function; this leads to new results which allow a discussion of models with so-called entropy loss.
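For orientation, the maximum entropy principle in its generic moment-constrained form (standard formulation, not the paper's notation):

\[
\max_{P}\; H(P) = -\sum_i p_i \ln p_i
\quad\text{subject to}\quad
\sum_i p_i f_k(i) = a_k,\; k = 1,\dots,m,
\]

with the maximizer, when it exists, of Gibbs form \(p_i \propto \exp\bigl(-\sum_k \lambda_k f_k(i)\bigr)\).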
Properties of Classical and Quantum Jensen-Shannon Divergence
Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the most important divergence measure of information theory, Kullback divergence. As opposed to Kullback divergence it …
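The standard definition (with M the midpoint mixture):

\[
\mathrm{JD}(P,Q) \;=\; \tfrac{1}{2}\,D(P\|M) + \tfrac{1}{2}\,D(Q\|M),
\qquad
M = \tfrac{1}{2}(P+Q).
\]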
Inequalities between entropy and index of coincidence derived from information diagrams
The main result of the paper is the determination of the precise range of the map P ↦ (IC(P), H(P)), which gives rise to precise lower as well as upper bounds for the entropy function.
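For reference, the two coordinates of that map (standard definitions of index of coincidence and entropy):

\[
\mathrm{IC}(P) \;=\; \sum_i p_i^{2},
\qquad
H(P) \;=\; -\sum_i p_i \ln p_i .
\]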
Information Topologies with Applications
Topologies related to information divergence are introduced. The conditional limit theorem is taken as motivating example, and simplified proofs of the relevant theorems are given. Continuity …
Binomial and Poisson distributions as maximum entropy distributions
The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence …
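One standard reading of the binomial half of this claim (stated hedged, from the literature rather than quoted from the paper): among all sums of n independent Bernoulli variables with a fixed mean, the binomial distribution maximizes entropy,

\[
H\Bigl(\sum_{i=1}^{n} X_i\Bigr) \;\le\; H\bigl(\mathrm{Bin}(n,p)\bigr)
\quad\text{for independent } X_i \sim \mathrm{Bernoulli}(p_i),\; \sum_{i=1}^{n} p_i = np .
\]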
AN ENTROPY POWER INEQUALITY FOR THE BINOMIAL FAMILY
In this paper, we prove that the classical Entropy Power Inequality, as derived in the continuous case, can be extended to the discrete family of binomial random variables with parameter 1/2.
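The continuous inequality being extended (standard statement, for independent real-valued X and Y with differential entropies h):

\[
e^{2h(X+Y)} \;\ge\; e^{2h(X)} + e^{2h(Y)} .
\]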
Rate of convergence to Poisson law in terms of information divergence
Precise bounds on the information divergence from a binomial distribution to the accompanying Poisson law are obtained. As a corollary, an upper bound for the total variation distance between the …
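The total-variation corollary presumably follows the usual Pinsker route (standard relation, with V(P,Q) = Σ_i |p_i − q_i|):

\[
V\bigl(\mathrm{Bin}(n,p),\,\mathrm{Po}(np)\bigr) \;\le\; \sqrt{2\,D\bigl(\mathrm{Bin}(n,p)\,\big\|\,\mathrm{Po}(np)\bigr)} .
\]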