
Publications


Rényi Divergence and Kullback-Leibler Divergence

- T. Erven, P. Harremoës
- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 12 June 2012

Entropy and the law of small numbers

- Ioannis Kontoyiannis, P. Harremoës, O. Johnson
- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 1 November 2002

Refinements of Pinsker's inequality

- A. A. Fedotov, P. Harremoës, F. Topsøe
- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1 June 2003

Maximum Entropy Fundamentals

- P. Harremoës, F. Topsøe
- Computer Science
- Entropy
- 30 September 2001

Properties of Classical and Quantum Jensen-Shannon Divergence

- J. Briët, P. Harremoës
- Physics
- 1 May 2009

Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the most important divergence measure of information theory, the Kullback divergence. As opposed to the Kullback divergence, it…
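As a concrete companion to this abstract, here is a minimal sketch of the classical Jensen-Shannon divergence for discrete distributions; the function names and the toy distributions are illustrative, not taken from the paper:

```python
import math

def kl(p, q):
    """Kullback divergence D(p || q) in nats for discrete distributions on a common support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Classical Jensen-Shannon divergence: average Kullback divergence to the midpoint mixture."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(jsd(p, q), jsd(q, p))  # equal values: JSD is symmetric, unlike the Kullback divergence
```

Unlike the Kullback divergence, the value is always finite and is bounded above by ln 2 (in nats).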

Inequalities between entropy and index of coincidence derived from information diagrams

- P. Harremoës, F. Topsøe
- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1 November 2001

Information Topologies with Applications

- P. Harremoës
- Mathematics
- 2007

Topologies related to information divergence are introduced. The conditional limit theorem is taken as a motivating example, and simplified proofs of the relevant theorems are given. Continuity…

Binomial and Poisson distributions as maximum entropy distributions

- P. Harremoës
- Computer Science
- IEEE Trans. Inf. Theory
- 1 July 2001

The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence…
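One "suitably defined set" in this line of work is the family of sums of independent Bernoulli trials with a fixed mean. The sketch below (my own illustration, not code from the paper) checks the two-trial case, where Bin(2, 1/2) should attain the largest entropy among Bernoulli sums with mean 1:

```python
import math

def entropy(pmf):
    """Shannon entropy in nats of a discrete distribution."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

def bernoulli_sum_pmf(p1, p2):
    """PMF of X1 + X2 for independent Bernoulli(p1) and Bernoulli(p2)."""
    return [(1 - p1) * (1 - p2),
            p1 * (1 - p2) + p2 * (1 - p1),
            p1 * p2]

# Fix the mean p1 + p2 = 1 and vary the split; the binomial split
# p1 = p2 = 1/2 gives the largest entropy of the sum.
h_binom = entropy(bernoulli_sum_pmf(0.5, 0.5))
for p1 in (0.1, 0.3, 0.45):
    print(p1, entropy(bernoulli_sum_pmf(p1, 1 - p1)) <= h_binom)
```

Here h_binom equals 1.5·ln 2, the entropy of Bin(2, 1/2), and every unequal split falls below it.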

AN ENTROPY POWER INEQUALITY FOR THE BINOMIAL FAMILY

- P. Harremoës, C. Vignat
- Mathematics
- 2003

In this paper, we prove that the classical Entropy Power Inequality, as derived in the continuous case, can be extended to the discrete family of binomial random variables with parameter 1/2.
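The statement can be checked numerically. The sketch below assumes the inequality takes the usual entropy-power form N(X+Y) ≥ N(X) + N(Y) with N(X) = e^{2H(X)}/(2πe); that normalization is my assumption (the constant cancels in the comparison), and the code is an illustration rather than the paper's proof:

```python
import math

def binom_entropy(n):
    """Shannon entropy (nats) of Bin(n, 1/2)."""
    pmf = [math.comb(n, k) / 2**n for k in range(n + 1)]
    return -sum(p * math.log(p) for p in pmf)

def entropy_power(h):
    """Entropy power N = exp(2h) / (2*pi*e)."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

# X, Y i.i.d. Bin(n, 1/2), so X + Y ~ Bin(2n, 1/2).
for n in (1, 2, 5, 10, 20):
    lhs = entropy_power(binom_entropy(2 * n))
    rhs = 2 * entropy_power(binom_entropy(n))
    print(n, lhs >= rhs - 1e-12)  # inequality holds; n = 1 gives equality
```

At n = 1 both sides equal 8/(2πe) exactly, since H(Bern(1/2)) = ln 2 and H(Bin(2, 1/2)) = 1.5·ln 2.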

Rate of convergence to Poisson law in terms of information divergence

- P. Harremoës, P. Ruzankin
- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 1 September 2004

The precise bounds on the information divergence from a binomial distribution to the accompanying Poisson law are obtained. As a corollary, an upper bound for the total variation distance between the…
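The quantity being bounded can be computed directly, since the binomial's support lies inside the Poisson's. This sketch (an illustration of the divergence itself, not the paper's bound) evaluates D(Bin(n, p) || Po(np)) in log space for a fixed mean np = 2:

```python
import math

def kl_binomial_poisson(n, p):
    """Information divergence D(Bin(n, p) || Po(np)) in nats, summed over the binomial support."""
    lam = n * p
    d = 0.0
    for k in range(n + 1):
        # log-pmfs, to stay accurate in the tails
        lb = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
              + k * math.log(p) + (n - k) * math.log1p(-p))
        lq = -lam + k * math.log(lam) - math.lgamma(k + 1)
        d += math.exp(lb) * (lb - lq)
    return d

for n in (10, 100, 1000):
    print(n, kl_binomial_poisson(n, 2.0 / n))  # shrinks as n grows, with np = 2 fixed
```

By Pinsker's inequality (refined in the Fedotov-Harremoës-Topsøe entry above), sqrt(d/2) then upper-bounds the total variation distance.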
