
Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when S_n = Σ_{i=1}^n X_i is the sum of the (possibly dependent) binary random variables X_1, X_2, ..., X_n, with E(X_i) = p_i…

- Oliver Johnson
- 2004
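The Poisson approximation setting above can be illustrated numerically. The following Python sketch (an illustration, not taken from the paper; it covers only independent X_i, whereas the paper also allows dependence) computes the exact distribution of a sum of independent Bernoulli(p_i) variables by convolution and checks it against Le Cam's classical bound d_TV(S_n, Po(λ)) ≤ Σ p_i², where λ = Σ p_i:

```python
from math import exp, factorial

def bernoulli_sum_pmf(ps):
    """Exact pmf of S_n = sum of independent Bernoulli(p_i), by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # X_i = 0
            new[k + 1] += q * p        # X_i = 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def tv_distance(ps, terms=60):
    """Total variation distance between S_n and Poisson(sum p_i)."""
    lam = sum(ps)
    sn = bernoulli_sum_pmf(ps)
    return 0.5 * sum(abs((sn[k] if k < len(sn) else 0.0) - poisson_pmf(lam, k))
                     for k in range(terms))

ps = [0.1, 0.2, 0.05, 0.15]
d = tv_distance(ps)
lecam = sum(p * p for p in ps)   # Le Cam's bound on d_TV
```

Here `d` comes out below `lecam` (= 0.075 for these p_i), as Le Cam's inequality guarantees.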

We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L^2 spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher…

- Oliver Johnson
- 2008

We prove that the Poisson distribution maximises entropy in the class of ultra-log-concave distributions, extending a result of Harremoës. The proof uses ideas concerning log-concavity, and a semigroup action involving adding Poisson variables and thinning. We go on to show that the entropy is a concave function along this semigroup. It is well-known that…
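As a quick numerical sanity check of this maximum-entropy statement (an illustration of mine, not part of the abstract): a Binomial(n, λ/n) distribution is ultra-log-concave with mean λ, so by the result its Shannon entropy should not exceed that of Poisson(λ).

```python
from math import exp, factorial, log, comb

def entropy(pmf):
    """Shannon entropy in nats, skipping zero-probability atoms."""
    return -sum(p * log(p) for p in pmf if p > 0)

def binomial_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def poisson_pmf(lam, terms=100):
    # truncated at `terms`; the omitted tail is negligible for small lam
    return [exp(-lam) * lam**k / factorial(k) for k in range(terms)]

lam = 2.0
# Binomial(10, lam/10) is ultra-log-concave with mean lam
h_bin = entropy(binomial_pmf(10, lam / 10))
h_poi = entropy(poisson_pmf(lam))
```

For λ = 2 this gives h_bin ≈ 1.62 < h_poi ≈ 1.70 nats, consistent with the theorem.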

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Let P_{S_n} be the distribution of a sum S_n = Σ_{i=1}^n Y_i of independent integer-valued random variables Y_i. Nonasymptotic bounds are derived for the distance between P_{S_n} and an…

Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation T_α. That is, if X and Y are independent random variables on Z_+ with ultra-log-concave probability mass functions, then H(T_α X + T_{1−α} Y) ≥…
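The thinning operation T_α can be computed exactly on a probability mass function: (T_α p)(k) = Σ_{n≥k} p(n) C(n,k) α^k (1−α)^{n−k}, i.e. each of the n individuals survives independently with probability α. A minimal Python sketch (illustrative; the function names are mine) verifies the standard fact that thinning Poisson(λ) yields Poisson(αλ):

```python
from math import exp, factorial, comb

def thin(pmf, alpha):
    """Thinning T_alpha: each of the n individuals survives independently
    with probability alpha; returns the pmf of the surviving count."""
    n_max = len(pmf) - 1
    return [sum(pmf[n] * comb(n, k) * alpha**k * (1 - alpha)**(n - k)
                for n in range(k, n_max + 1))
            for k in range(n_max + 1)]

def poisson_pmf(lam, terms=80):
    # truncated pmf; the tail beyond `terms` is negligible here
    return [exp(-lam) * lam**k / factorial(k) for k in range(terms)]

lam, alpha = 3.0, 0.4
thinned = thin(poisson_pmf(lam), alpha)
target = poisson_pmf(alpha * lam)   # Poisson(1.2)
```

Here `thinned` and `target` agree to within truncation error, reflecting that the Poisson family is closed under thinning.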

We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising…

An information-theoretic perspective on group testing problems has recently been proposed by Atia and Saligrama, in order to characterise the optimal number of tests. Their results hold in the noiseless case, in the case where only false positives occur, and in the case where only false negatives occur. We extend their results to a model containing both false positives and false…

We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normal random variables. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the EPI do not in fact hold, but propose an alternative formulation…

We consider the problem of non-adaptive noiseless group testing of N items of which K are defective. We describe four detection algorithms: the COMP algorithm of Chan et al.; two new algorithms, DD and SCOMP, which require stronger evidence to declare an item defective; and an essentially optimal but computationally difficult algorithm called SSS. By…
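The COMP decoder mentioned above admits a very short implementation: any item that appears in a negative test is marked non-defective, and every remaining item is declared defective. In the noiseless model this never produces false negatives, only possible false positives. A minimal sketch, with a made-up example instance (not from the paper):

```python
def comp_decode(tests, results, n_items):
    """COMP: items appearing in any negative test are non-defective;
    declare everything else defective."""
    defective = set(range(n_items))
    for pool, positive in zip(tests, results):
        if not positive:
            defective -= set(pool)   # negative test clears its whole pool
    return sorted(defective)

# Toy noiseless instance: items 1 and 4 are truly defective.
true_defective = {1, 4}
tests = [[0, 1, 2], [2, 3], [3, 4, 5], [0, 5]]
results = [bool(set(pool) & true_defective) for pool in tests]
decoded = comp_decode(tests, results, 6)
```

On this instance the two negative tests clear items 0, 2, 3, 5, so COMP recovers exactly {1, 4}; in general it may over-declare, which is what the stronger-evidence algorithms DD and SCOMP address.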

Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisson approximation limit theorems (analogous to the corresponding developments for the central limit theorem and for simple Poisson approximation), this work examines sufficient conditions under which the compound Poisson distribution has maximal entropy within…