Oliver Johnson

An information-theoretic perspective on group testing problems has recently been proposed by Atia and Saligrama, in order to characterise the optimal number of tests. Their results cover the noiseless case, the case where only false positives occur, and the case where only false negatives occur. We extend their results to a model containing both false positives and false negatives.
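For orientation (a standard fact of the area, not a claim taken from this abstract), the counting bound indicates the scale of the "optimal number of tests" being characterised:

```latex
% Counting bound (standard fact, not from the abstract above): T tests produce
% at most 2^T distinct outcome patterns, which must separate the \binom{N}{K}
% possible defective sets, so
\[
  T \;\ge\; \log_2 \binom{N}{K} \;\approx\; K \log_2 (N/K)
  \qquad \text{for } K \ll N .
\]
```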
Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when $S_n = \sum_{i=1}^{n} X_i$ is the sum of the (possibly dependent) binary random variables $X_1, X_2, \ldots, X_n$, with $E(X_i) = p_i$ …
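As a quick numerical illustration of the quantity involved (a sketch under assumed parameters, not the paper's method or its bound), the snippet below computes the exact distribution of a Bernoulli sum and its KL divergence from the matching Poisson law:

```python
# Numerical sketch (illustrative, not the paper's bound): compare the law of
# S_n = X_1 + ... + X_n, X_i ~ Bernoulli(p_i) independent, with Po(lambda),
# lambda = sum p_i, via the divergence D(P_Sn || Po(lambda)).
import numpy as np
from scipy.stats import poisson

def bernoulli_sum_pmf(ps):
    """Exact pmf of the sum of independent Bernoulli(p_i), by convolution."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

ps = np.full(100, 0.02)                # assumed parameters: n = 100, p_i = 0.02
lam = ps.sum()                         # lambda = 2.0
pmf = bernoulli_sum_pmf(ps)
po = poisson.pmf(np.arange(len(pmf)), lam)

mask = pmf > 0                         # D(P || Po): sum only over the support of P
kl = np.sum(pmf[mask] * np.log(pmf[mask] / po[mask]))
print(f"D(P_Sn || Po({lam})) = {kl:.6f}")        # small when the p_i are small
print(f"sum of p_i^2        = {np.sum(ps**2):.6f}")  # classical bounds are of this order
```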
We consider the problem of nonadaptive noiseless group testing of N items, of which K are defective. We describe four detection algorithms: the COMP algorithm of Chan et al.; two new algorithms, DD and SCOMP, which require stronger evidence to declare an item defective; and an essentially optimal but computationally difficult algorithm called SSS. …
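Since the abstract specifies the COMP rule exactly (any item appearing in a negative test is definitely non-defective; everything else is declared defective), a minimal sketch is easy to write. The Bernoulli test design and all parameters below are illustrative assumptions, not the paper's, and the DD/SCOMP refinements are omitted:

```python
# Minimal sketch of the COMP decoder for nonadaptive noiseless group testing.
# Assumption: a Bernoulli random test design; the paper's setup may differ.
import numpy as np

rng = np.random.default_rng(0)
N, K, T = 500, 10, 200                       # items, defectives, tests (assumed)

defective = np.zeros(N, dtype=bool)
defective[rng.choice(N, size=K, replace=False)] = True

# X[t, i] = True if item i is included in test t.
X = rng.random((T, N)) < (1.0 / K)           # Bernoulli(1/K) design
outcomes = (X & defective).any(axis=1)       # noiseless OR of included items

# COMP: items appearing in a negative test are definitely non-defective;
# declare everything else defective.
appears_in_negative = X[~outcomes].any(axis=0)
comp_estimate = ~appears_in_negative

print("false positives:", int((comp_estimate & ~defective).sum()))
print("false negatives:", int((~comp_estimate & defective).sum()))  # 0 for COMP
```

COMP never produces false negatives in the noiseless model, which is why DD and SCOMP, requiring stronger evidence to declare an item defective, can only improve on its false-positive count.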
1. Although (-)-cytisine is a rigid structure, it occurs in the crystal in two distinct but very similar conformations in which the pyridone ring is tilted relative to the charged nitrogen atom at much the same angle as the pyridine ring is in (-)-nicotine hydrogen iodide. The carbonyl group in the pyridone ring of (-)-cytisine, however, is on the side of …
Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation $T_\alpha$. That is, if $X$ and $Y$ are independent random variables on $\mathbb{Z}_+$ with ultra-log-concave probability mass functions, then $H(T_\alpha X + T_{1-\alpha} Y) \ge \alpha H(X) + (1-\alpha) H(Y)$.
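A numerical sanity check of this inequality (an illustration under assumed inputs, not the paper's proof): the thinning map $T_\alpha$ acts on a pmf by letting each unit survive independently with probability $\alpha$, and binomial pmfs are ultra-log-concave, so the stated concavity applies:

```python
# Numerical check (illustrative) of H(T_a X + T_{1-a} Y) >= a H(X) + (1-a) H(Y)
# for ultra-log-concave X, Y; binomial laws are ultra-log-concave.
import numpy as np
from scipy.stats import binom

def thin(pmf, alpha):
    """pmf of T_alpha X: each of the X units survives independently w.p. alpha."""
    out = np.zeros(len(pmf))
    for m, pm in enumerate(pmf):
        out[: m + 1] += pm * binom.pmf(np.arange(m + 1), m, alpha)
    return out

def entropy(pmf):
    p = pmf[pmf > 0]
    return -np.sum(p * np.log(p))

support = np.arange(60)                   # truncated support; tails negligible
px = binom.pmf(support, 10, 0.3)          # X ~ Bin(10, 0.3)  (assumed inputs)
py = binom.pmf(support, 20, 0.4)          # Y ~ Bin(20, 0.4)

alpha = 0.4
lhs = np.convolve(thin(px, alpha), thin(py, 1 - alpha))[: len(support)]
print(f"H(T_a X + T_(1-a) Y) = {entropy(lhs):.4f}")
print(f"a H(X) + (1-a) H(Y)  = {alpha*entropy(px) + (1-alpha)*entropy(py):.4f}")
```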
The law of thin numbers is a Poisson approximation theorem related to the thinning operation. We use information projections to derive lower bounds on the information divergence from a thinned distribution to a Poisson distribution. Conditions for the existence of projections are given. If an information projection exists it must be an element of the …
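For context, a standard statement of the law of thin numbers (the notation here is assumed and may differ from the paper's):

```latex
% Law of thin numbers (standard statement, for context): if P is a pmf on Z_+
% with mean lambda and P^{*n} denotes its n-fold convolution, then
\[
  T_{1/n}\bigl(P^{*n}\bigr) \;\longrightarrow\; \mathrm{Po}(\lambda)
  \qquad (n \to \infty),
\]
% and the paper above bounds the divergence of a thinned distribution from
% the Poisson law from below via information projections.
```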
We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the EPI do not in fact hold, but propose an alternative formulation …
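For reference, the continuous inequality being mirrored here (a standard statement, not taken from this abstract):

```latex
% Shannon's Entropy Power Inequality (continuous case, for reference): for
% independent R^d-valued X, Y with differential entropies h(X), h(Y),
\[
  e^{2h(X+Y)/d} \;\ge\; e^{2h(X)/d} + e^{2h(Y)/d},
\]
% with equality iff X and Y are Gaussian with proportional covariances.
% The paper above asks what survives when X and Y are discrete and the
% natural extremal family is Poisson rather than Gaussian.
```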