Publications
Generalized Entropy Power Inequalities and Monotonicity Properties of Information
  • M. Madiman, A. Barron
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 11 May 2006
tl;dr: A simple proof of the monotonicity of information in central limit theorems for sums of independent random variables.
  • 132 citations (11 highly influential)
  • Open Access
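For orientation, the monotonicity result can be stated in standard notation (a paraphrase with h denoting differential entropy, not a quotation from the paper): for i.i.d. random variables X_1, X_2, … with finite variance,

\[
h\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right) \;\ge\; h\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right),
\]

so entropy is non-decreasing along the central limit theorem, with the Gaussian limit attaining the maximum.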
Information Inequalities for Joint Distributions, With Interpretations and Applications
  • M. Madiman, P. Tetali
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 31 December 2008
tl;dr: Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies.
  • 109 citations (10 highly influential)
  • Open Access
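The upper bounds have the following flavor (paraphrased in the standard language of fractional covers, not quoted from the paper): if C is a collection of subsets of {1, …, n} and the nonnegative weights (α_s) satisfy ∑_{s ∋ i} α_s ≥ 1 for each index i, then

\[
H(X_1,\ldots,X_n) \;\le\; \sum_{s \in \mathcal{C}} \alpha_s\, H(X_s),
\]

which recovers Shearer's lemma and Han's inequality as special cases.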
The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
  • S. Bobkov, M. Madiman
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 14 June 2010
tl;dr: The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1.
  • 65 citations (8 highly influential)
  • Open Access
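The "range of just 1" admits a compact statement (a paraphrase in standard notation): if X has a log-concave density f on R^n, then

\[
\log \|f\|_\infty^{-1/n} \;\le\; \frac{h(X)}{n} \;\le\; \log \|f\|_\infty^{-1/n} + 1,
\]

where the lower bound holds for any density and the upper bound is the log-concavity constraint.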
Beyond the Entropy Power Inequality, via Rearrangements
  • Liyao Wang, M. Madiman
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 23 July 2013
tl;dr: A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of rearrangements.
  • 61 citations (7 highly influential)
  • Open Access
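A sketch of the shape of the bound (a paraphrase; f* denotes the spherically symmetric decreasing rearrangement of a density f, and h_λ the Rényi entropy of order λ): if X_1, …, X_n are independent with densities f_1, …, f_n, and X_1*, …, X_n* are independent with densities f_1*, …, f_n*, then

\[
h_\lambda(X_1 + \cdots + X_n) \;\ge\; h_\lambda(X_1^* + \cdots + X_n^*).
\]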
Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information
tl;dr: The sumset and inverse sumset theories of Freiman, Plünnecke, and Ruzsa give bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} of two discrete sets A, B to the cardinalities (or the finer structure) of the original sets A, B.
  • 48 citations (5 highly influential)
  • Open Access
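One representative inequality from this line of work (a paraphrase of the entropy analogue of the Ruzsa triangle inequality): for independent random vectors X, Y, Z with finite differential entropies,

\[
h(X - Z) \;\le\; h(X - Y) + h(Y - Z) - h(Y).
\]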
Optimal Concentration of Information Content For Log-Concave Densities
tl;dr: An elementary proof is provided of sharp bounds for the varentropy of random vectors with log-concave densities, as well as for deviations of the information content from its mean.
  • 36 citations (5 highly influential)
  • Open Access
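The sharp varentropy bound can be written as follows (a paraphrase in standard notation): if X has a log-concave density f on R^n, then

\[
\operatorname{Var}\!\left(\log \frac{1}{f(X)}\right) \;\le\; n,
\]

a dimension-free constant per coordinate; the bound is sharp, attained for instance by products of exponential densities.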
Fisher Information, Compound Poisson Approximation, and the Poisson Channel
tl;dr: Two new "local information quantities" are introduced to play the role of Fisher information in the setting of compound Poisson approximation and the Poisson channel.
  • 24 citations (4 highly influential)
  • Open Access
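For background, the classical quantity whose role the new local quantities play is the Fisher information of a density f (the standard definition, not specific to this paper):

\[
I(X) \;=\; \int \frac{f'(x)^2}{f(x)}\, dx \;=\; \mathbb{E}\!\left[\big(\partial_x \log f(X)\big)^2\right];
\]

the paper develops discrete analogues adapted to compound Poisson approximation and the Poisson channel.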
Fractional generalizations of Young and Brunn-Minkowski inequalities
tl;dr: A generalization of Young's inequality for convolution with sharp constant is conjectured for scenarios where more than two functions are being convolved, and it is proven for certain parameter ranges.
  • 14 citations (4 highly influential)
  • Open Access
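For context, the two classical inequalities being generalized (standard statements, not from the paper): Young's convolution inequality says that for exponents with 1/p + 1/q = 1 + 1/r,

\[
\|f * g\|_r \;\le\; \|f\|_p\, \|g\|_q,
\]

with a sharp constant identified by Beckner and by Brascamp and Lieb, while the Brunn-Minkowski inequality says |A + B|^{1/n} ≥ |A|^{1/n} + |B|^{1/n} for nonempty compact A, B ⊂ R^n; the paper's conjecture concerns versions with more than two summands and fractional weights.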
Forward and Reverse Entropy Power Inequalities in Convex Geometry
tl;dr: We survey various recent developments on forward and reverse entropy power inequalities, not just for the Shannon-Boltzmann entropy but also more generally for Rényi entropy.
  • 63 citations (3 highly influential)
  • Open Access
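The forward inequality at the center of the survey is Shannon's entropy power inequality (standard statement): for independent random vectors X, Y in R^n,

\[
N(X + Y) \;\ge\; N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n},
\]

and reverse forms ask when the opposite inequality holds, up to constants, after suitable volume-preserving linear transformations.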
Dimensional behaviour of entropy and information
tl;dr: We develop an information-theoretic perspective on some questions in convex geometry, providing for instance a new equipartition property for log-concave probability measures, an entropic formulation of the hyperplane conjecture, and a new reverse entropy power inequality analogous to V. Milman's reverse Brunn–Minkowski inequality.
  • 36 citations (3 highly influential)
  • Open Access
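The reverse entropy power inequality mentioned takes the following shape (a paraphrase in standard notation, with entropy power N(X) = e^{2h(X)/n}/(2πe)): there is a universal constant C such that for any independent log-concave random vectors X, Y in R^n, one can find volume-preserving linear maps u_1, u_2 with

\[
N\big(u_1(X) + u_2(Y)\big) \;\le\; C\,\big(N(X) + N(Y)\big).
\]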