We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more…
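For context, the Stam inequality referenced above has a standard classical form; the statement below is the usual textbook formulation (supplied here for orientation, not drawn from the truncated abstract). For a random vector $X$ in $\mathbb{R}^n$ with density $f$:

```latex
% Entropy power N(X) and Fisher information J(X):
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\qquad
J(X) = \int_{\mathbb{R}^n} \frac{|\nabla f(x)|^2}{f(x)}\, dx,
% Stam's inequality:
\qquad
N(X)\, J(X) \;\ge\; n,
```

with equality precisely when $X$ is a Gaussian random vector.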
The p-th moment matrix is defined for a real random vector, generalizing the classical covariance matrix. Sharp inequalities relating the p-th moment and Rényi entropy are established, generalizing the classical inequality relating the second moment and the Shannon entropy. The extremal distributions for these inequalities are completely characterized.
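The classical second-moment inequality that this abstract generalizes is the standard statement that, among random variables with a given second moment, the centered Gaussian maximizes Shannon entropy (a well-known fact, stated here only for context):

```latex
% For a real random variable X with density and finite second moment,
h(X) \;\le\; \tfrac{1}{2}\log\!\bigl(2\pi e\, \mathbb{E}[X^2]\bigr),
```

with equality if and only if $X$ is a centered Gaussian.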
We show that for a special class of probability distributions that we call contoured distributions, information theoretic invariants and inequalities are equivalent to geometric invariants and inequalities of bodies in Euclidean space associated with the distributions. Using this, we obtain characterizations of contoured distributions with extremal Shannon…
An affine invariant p-th moment measure is defined for a random vector and used to prove sharp moment-entropy inequalities that are more general and stronger than standard moment-entropy inequalities.
Two new approaches are presented to establish the existence of polytopal solutions to the discrete-data L_p Minkowski problem for all p > 1. As observed by Schneider, the Brunn-Minkowski theory springs from joining the notion of ordinary volume in Euclidean d-space, R^d, with that of Minkowski combinations of convex bodies. One of the cornerstones of…
A unified approach is presented for establishing a broad class of Cramér-Rao inequalities for the location parameter, including, as special cases, the original inequality of Cramér and Rao, as well as an L_p version recently established by the authors. The new approach allows for generalized moments and Fisher information measures to be defined by convex…
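The original Cramér-Rao inequality for the location parameter, of which the abstract's result is a generalization, admits the following standard one-dimensional statement (a classical fact, included only for context):

```latex
% For a real random variable X with smooth density f and
% Fisher information J(X) = \int_{\mathbb{R}} \frac{f'(x)^2}{f(x)}\, dx,
\operatorname{Var}(X)\, J(X) \;\ge\; 1,
```

with equality if and only if $X$ is Gaussian.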