
- Dino Sejdinovic, Oliver Johnson
- 2010 48th Annual Allerton Conference on…
- 2010

An information theoretic perspective on group testing problems has recently been proposed by Atia and Saligrama, in order to characterise the optimal number of tests. Their results hold in the noiseless case, where only false positives occur, and where only false negatives occur. We extend their results to a model containing both false positives and false…

- Ioannis Kontoyiannis, Peter Harremoës, Oliver Johnson
- IEEE Transactions on Information Theory
- 2005

Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when S_n = ∑_{i=1}^n X_i is the sum of the (possibly dependent) binary random variables X_1, X_2, ..., X_n, with E(X_i) = p_i…
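
The comparison the abstract describes can be illustrated numerically. The sketch below (our own illustration, not the paper's method) assumes *independent* Bernoulli(p_i) summands, computes the exact law of S_n by convolution, and measures its relative entropy to the Poisson law with the same mean; all function names are ours.

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of a sum of independent Bernoulli(p_i), by convolution."""
    pmf = [1.0]
    for p in ps:
        nxt = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            nxt[k] += q * (1 - p)      # this summand contributes 0
            nxt[k + 1] += q * p        # this summand contributes 1
        pmf = nxt
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def relative_entropy_to_poisson(ps):
    """D(law of S_n || Poisson(sum p_i)) in nats."""
    lam = sum(ps)
    return sum(q * math.log(q / poisson_pmf(lam, k))
               for k, q in enumerate(bernoulli_sum_pmf(ps)) if q > 0)

# Ten Bernoulli(0.1) summands: the divergence is small, consistent with
# classical bounds of order sum(p_i^2) = 0.1.
print(relative_entropy_to_poisson([0.1] * 10))
```

With equal p_i = 0.1 the sum is Binomial(10, 0.1), and the printed divergence is well below the classical ∑ p_i² bound.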

English abstract: We consider the Student-t and Student-r distributions, which maximise Rényi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Rényi maximising…

- Oliver Johnson
- 2004

We give conditions for an O(1/n) rate of convergence of Fisher information and relative entropy in the Central Limit Theorem. We use the theory of projections in L2 spaces and Poincaré inequalities to provide a better understanding of the decrease in Fisher information implied by results of Barron and Brown. We show that if the standardized Fisher…

- Oliver Johnson
- 2008

We prove that the Poisson distribution maximises entropy in the class of ultra-log-concave distributions, extending a result of Harremoës. The proof uses ideas concerning log-concavity, and a semigroup action involving adding Poisson variables and thinning. We go on to show that the entropy is a concave function along this semigroup.
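
The maximum-entropy statement can be sanity-checked numerically. The sketch below (a hedged illustration under our own naming, not the paper's proof) verifies that Bin(n, λ/n) is ultra-log-concave with mean λ, and that its entropy does not exceed that of Poisson(λ), in line with the theorem.

```python
import math

def entropy(pmf):
    """Shannon entropy in nats of a pmf given as a list."""
    return -sum(p * math.log(p) for p in pmf if p > 0)

def binomial_pmf(n, p):
    return [math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

def poisson_pmf(lam, kmax):
    # Truncated support; the tail mass beyond kmax is negligible here.
    return [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(kmax + 1)]

def is_ultra_log_concave(pmf):
    """Ultra-log-concavity: k * p_k^2 >= (k + 1) * p_{k-1} * p_{k+1}."""
    return all(k * pmf[k] ** 2 >= (k + 1) * pmf[k - 1] * pmf[k + 1] - 1e-15
               for k in range(1, len(pmf) - 1))

lam = 2.0
bino = binomial_pmf(20, lam / 20)                      # mean 2, ultra-log-concave
print(is_ultra_log_concave(bino))                      # True
print(entropy(bino) <= entropy(poisson_pmf(lam, 60)))  # True
```

The choice Bin(20, 0.1) is arbitrary; any ultra-log-concave law with mean λ should satisfy the same entropy comparison.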

- Matthew Aldridge, Leonardo Baldassini, Oliver Johnson
- IEEE Transactions on Information Theory
- 2014

We consider the problem of nonadaptive noiseless group testing of N items of which K are defective. We describe four detection algorithms: the COMP algorithm of Chan et al.; two new algorithms, DD and SCOMP, which require stronger evidence to declare an item defective; and an essentially optimal but computationally difficult algorithm called SSS. We…
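
The COMP rule mentioned above is simple enough to sketch: any item appearing in a negative test is cleared, and everything else is declared defective. The code below is our illustration, with a random Bernoulli pooling design assumed for the tests (not necessarily the paper's construction).

```python
import random

def comp_decode(num_items, tests):
    """COMP decoder. tests: list of (pool, outcome) pairs, where pool is a set
    of item indices and outcome is True iff the pool contains a defective."""
    cleared = set()
    for pool, outcome in tests:
        if not outcome:
            cleared |= pool  # everyone in a negative test is non-defective
    return set(range(num_items)) - cleared  # declared defective

random.seed(0)
N, K, T = 100, 3, 40
defective = set(random.sample(range(N), K))

# Noiseless random pooling: each item joins each pool with probability 1/K.
tests = []
for _ in range(T):
    pool = {i for i in range(N) if random.random() < 1.0 / K}
    tests.append((pool, bool(pool & defective)))

# In the noiseless model COMP never misses a true defective; it can only
# over-declare items as defective.
print(comp_decode(N, tests) >= defective)  # True
```

The one-sided guarantee in the final line holds for any test design: a defective item can never appear in a negative pool, so it is never cleared.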

- R B Barlow, O Johnson
- British journal of pharmacology
- 1989

1. Although (-)-cytisine is a rigid structure, it occurs in the crystal in two distinct but very similar conformations in which the pyridone ring is tilted relative to the charged nitrogen atom at much the same angle as the pyridine ring is in (-)-nicotine hydrogen iodide. The carbonyl group in the pyridone ring of (-)-cytisine, however, is on the side of…

- Yaming Yu, Oliver Johnson
- 2009 IEEE International Symposium on Information…
- 2009

Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation T_α. That is, if X and Y are independent random variables on Z_+ with ultra-log-concave probability mass functions, then H(T_α X + T_{1−α} Y) ≥…
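
The thinning operation T_α here is binomial thinning: each of the X individuals survives independently with probability α. The sketch below (our own illustration and naming) computes the thinned pmf exactly and checks the well-known fact that thinning Poisson(λ) yields Poisson(αλ), up to truncation error.

```python
import math

def thin(pmf, alpha):
    """pmf of T_alpha X, given the pmf of X on {0, 1, 2, ...}:
    conditional on X = n, the thinned value is Binomial(n, alpha)."""
    out = [0.0] * len(pmf)
    for n, p in enumerate(pmf):
        for k in range(n + 1):
            out[k] += p * math.comb(n, k) * alpha ** k * (1 - alpha) ** (n - k)
    return out

def poisson_pmf(lam, kmax):
    # Truncated support; tail mass beyond kmax is negligible for small lam.
    return [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(kmax + 1)]

lam, alpha = 1.0, 0.3
thinned = thin(poisson_pmf(lam, 30), alpha)
target = poisson_pmf(alpha * lam, 30)
print(max(abs(a - b) for a, b in zip(thinned, target)) < 1e-9)  # True
```

The same `thin` function could be combined with a convolution to probe the concavity inequality H(T_α X + T_{1−α} Y) numerically for small examples.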

- Peter Harremoës, Oliver Johnson, Ioannis Kontoyiannis
- 2008 IEEE International Symposium on Information…
- 2008

The law of thin numbers is a Poisson approximation theorem related to the thinning operation. We use information projections to derive lower bounds on the information divergence from a thinned distribution to a Poisson distribution. Conditions for the existence of projections are given. If an information projection exists it must be an element of the…

- Oliver Johnson, Yaming Yu
- IEEE Transactions on Information Theory
- 2010

We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the EPI do not in fact hold, but propose an alternative formulation…