Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when $S_n = \sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1, X_2, \ldots, X_n$, with $E(X_i) = p_i$ …
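The abstract is truncated, but the quantity being bounded is standard: the distance between the law of $S_n$ and the Poisson distribution with matching mean $\lambda = \sum_i p_i$. A minimal numerical sketch, for the independent case only and using the classical Le Cam bound $\sum_i p_i^2$ rather than the paper's information-theoretic bounds (parameter values are illustrative):

```python
import numpy as np
from scipy.stats import poisson

# Sketch: total variation distance between the law of S_n = sum_i X_i
# (independent Bernoulli(p_i) here; the paper also covers dependence)
# and Poisson(lambda) with lambda = sum_i p_i, compared against the
# classical Le Cam bound sum_i p_i^2.
p = np.array([0.05, 0.10, 0.02, 0.08, 0.04])   # illustrative p_i
lam = p.sum()

# Exact pmf of S_n by convolving the Bernoulli pmfs.
pmf = np.array([1.0])
for pi in p:
    pmf = np.convolve(pmf, [1.0 - pi, pi])

k = np.arange(len(pmf))
tv = 0.5 * (np.abs(pmf - poisson.pmf(k, lam)).sum()
            + poisson.sf(len(pmf) - 1, lam))   # Poisson mass beyond n
print(f"TV distance = {tv:.6f}, Le Cam bound = {(p**2).sum():.6f}")
```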
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Let $P_{S_n}$ be the distribution of a sum $S_n = \sum_{i=1}^n Y_i$ of independent integer-valued random variables $Y_i$. Nonasymptotic bounds are derived for the distance between $P_{S_n}$ and an …
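For concreteness, a compound Poisson law $\mathrm{CP}(\lambda, Q)$ is the law of $\sum_{i=1}^N Z_i$ with $N \sim \mathrm{Poisson}(\lambda)$ and $Z_i$ i.i.d. with distribution $Q$. A small sketch (function name and truncation parameters are my own) builds its pmf directly from this definition:

```python
import numpy as np
from scipy.stats import poisson

def compound_poisson_pmf(lam, q, support_len, j_max=60):
    """Truncated pmf of CP(lam, Q) = sum_j e^{-lam} lam^j / j! * Q^{*j}.

    q is the pmf of Q on {0, 1, 2, ...}; Q^{*j} is its j-fold convolution.
    """
    pmf = np.zeros(support_len)
    conv = np.zeros(support_len)
    conv[0] = 1.0                          # Q^{*0} is a point mass at 0
    for j in range(j_max):
        pmf += poisson.pmf(j, lam) * conv
        conv = np.convolve(conv, q)[:support_len]
    return pmf

# Example: jump distribution Q supported on {1, 2, 3}.
print(compound_poisson_pmf(2.0, np.array([0.0, 0.6, 0.3, 0.1]), 30)[:5])
```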
We consider the problem of nonadaptive noiseless group testing of N items of which K are defective. We describe four detection algorithms, the COMP algorithm of Chan et al., two new algorithms, DD and SCOMP, which require stronger evidence to declare an item defective, and an essentially optimal but computationally difficult algorithm called SSS. We …
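As a sketch of the two simplest decoders named here, under a Bernoulli test design with illustrative parameters (SCOMP and SSS are omitted; the paper, not this code, is the authority on the algorithms):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, T = 100, 5, 60                       # items, defectives, tests (illustrative)
defective = rng.choice(N, size=K, replace=False)
X = rng.random((T, N)) < np.log(2) / K     # Bernoulli(p) design, p ~ (ln 2)/K
y = X[:, defective].any(axis=1)            # noiseless OR outcomes of the T tests

# COMP: any item appearing in a negative test is non-defective;
# declare every remaining item ("possible defective") defective.
in_negative_test = X[~y].any(axis=0)
comp_estimate = np.flatnonzero(~in_negative_test)

# DD: among the possible defectives, declare an item defective only when
# some positive test contains no other possible defective (stronger evidence).
dd_estimate = set()
for t in np.flatnonzero(y):
    survivors = np.flatnonzero(X[t] & ~in_negative_test)
    if len(survivors) == 1:
        dd_estimate.add(int(survivors[0]))

print(sorted(defective), sorted(comp_estimate), sorted(dd_estimate))
```

By construction COMP never misses a true defective, while DD never declares a false positive; the two errors trade off through the required strength of evidence.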
Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation $T_\alpha$. That is, if X and Y are independent random variables on $\mathbb{Z}_+$ with ultra-log-concave probability mass functions, then $H(T_\alpha X + T_{1-\alpha} Y) \geq$ …
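The truncated display presumably continues $\geq \alpha H(X) + (1-\alpha)H(Y)$, which is the concavity inequality described. A quick numerical check under that assumption (Poisson pmfs are used because they are ultra-log-concave; truncation lengths are illustrative):

```python
import numpy as np
from scipy.stats import binom, poisson

def thin(pmf, alpha):
    """pmf of the alpha-thinning T_alpha X: each of X's units is kept
    independently with probability alpha."""
    out = np.zeros(len(pmf))
    for m, pm in enumerate(pmf):
        out[:m + 1] += pm * binom.pmf(np.arange(m + 1), m, alpha)
    return out

def H(pmf):
    p = pmf[pmf > 0]
    return -(p * np.log(p)).sum()

n = 60                                    # truncation length (illustrative)
k = np.arange(n)
pX = poisson.pmf(k, 3.0)                  # Poisson pmfs are ultra-log-concave
pY = poisson.pmf(k, 5.0)
alpha = 0.3
lhs = H(np.convolve(thin(pX, alpha), thin(pY, 1 - alpha))[:n])
rhs = alpha * H(pX) + (1 - alpha) * H(pY)
print(lhs, rhs, lhs >= rhs)               # concavity: lhs >= rhs
```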
We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising …
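For reference, the Rényi entropy of order $q$ referred to here is the standard one, recovering Shannon entropy in the limit $q \to 1$:

$$ h_q(f) \;=\; \frac{1}{1-q}\,\log \int f(x)^q \, dx, \qquad h_q(f) \;\longrightarrow\; -\int f(x)\log f(x)\, dx \quad (q \to 1). $$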
An information-theoretic perspective on group testing problems has recently been proposed by Atia and Saligrama, in order to characterise the optimal number of tests. Their results hold in the noiseless case, in the case where only false positives occur, and in the case where only false negatives occur. We extend their results to a model containing both false positives and false …
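The baseline in the noiseless setting, stated here only for context, is the counting bound: $T$ tests can distinguish at most $2^T$ outcome patterns, so reliable recovery of the defective set requires

$$ T \;\geq\; \log_2 \binom{N}{K} \;\approx\; K \log_2 \frac{N}{K} \qquad (K = o(N)), $$

and the information-theoretic characterisations discussed in this line of work address how closely this can be achieved, including under noise.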
We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the EPI do not in fact hold, but propose an alternative formulation …
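As a quick illustration that the most naive transcription of the EPI can fail in the discrete setting (the example and parameters are mine, not the paper's):

```python
import numpy as np
from scipy.stats import poisson

def H(pmf):
    p = pmf[pmf > 0]
    return -(p * np.log(p)).sum()

k = np.arange(40)
lam = 0.1
hX = H(poisson.pmf(k, lam))          # X, Y ~ Poisson(lam), independent
hS = H(poisson.pmf(k, 2 * lam))      # X + Y ~ Poisson(2 * lam)

# Naive transcription of the EPI: exp(2H(X+Y)) >= exp(2H(X)) + exp(2H(Y)).
# Here the left side (~2.92) is smaller than the right (~3.90), so this
# direct analogue fails even for Poisson summands at small means.
print(np.exp(2 * hS), 2 * np.exp(2 * hX))
```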
Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisson approximation limit theorems (analogous to the corresponding developments for the central limit theorem and for simple Poisson approximation), this work examines sufficient conditions under which the compound Poisson distribution has maximal entropy within …
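In the simple Poisson case the corresponding maximum-entropy statement can be seen numerically: binomials with mean $\lambda$ are ultra-log-concave, and their entropies increase towards the Poisson($\lambda$) entropy, consistent with the Poisson maximum-entropy property in that class (Johnson, 2007). A quick illustrative check, with parameter values of my own choosing:

```python
import numpy as np
from scipy.stats import binom, poisson

def H(pmf):
    p = pmf[pmf > 0]
    return -(p * np.log(p)).sum()

lam = 2.0                                 # fixed mean (illustrative)
for n in [5, 10, 50, 1000]:
    print(n, H(binom.pmf(np.arange(n + 1), n, lam / n)))
print("Poisson:", H(poisson.pmf(np.arange(80), lam)))
# The binomial entropies increase towards the Poisson value as n grows.
```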