
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities.…

- Erwin Lutwak, Songjun Lv, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2012
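The extremal property stated in the abstract above can be checked numerically. The following sketch (the helper names and the choice of comparison densities are illustrative, not from the paper) uses textbook closed-form differential entropies for three symmetric distributions, each scaled to have the same second moment; the Gaussian value should come out largest.

```python
import math

# Closed-form differential entropies (in nats) for three symmetric
# densities, each scaled so that the second moment E[X^2] equals m2.
def entropy_gaussian(m2):
    # N(0, m2): h = 0.5 * ln(2*pi*e*m2)
    return 0.5 * math.log(2 * math.pi * math.e * m2)

def entropy_uniform(m2):
    # Uniform on [-a, a] has E[X^2] = a^2/3 and h = ln(2a)
    a = math.sqrt(3 * m2)
    return math.log(2 * a)

def entropy_laplace(m2):
    # Laplace(b) has E[X^2] = 2*b^2 and h = 1 + ln(2b)
    b = math.sqrt(m2 / 2)
    return 1 + math.log(2 * b)

m2 = 1.0
print(entropy_gaussian(m2), entropy_laplace(m2), entropy_uniform(m2))
# The Gaussian entropy is the largest of the three, as the
# moment-entropy inequality predicts.
```

This checks only three distributions at one second moment, of course; the inequality itself covers all continuous random variables with that second moment.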

We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more…
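The classical one-dimensional Stam inequality mentioned above states that N(X)·J(X) ≥ 1, where N(X) = exp(2h(X))/(2πe) is the entropy power and J(X) the Fisher information, with equality exactly for Gaussians. A minimal sketch using textbook closed forms (the function names are mine) verifies equality for a Gaussian and strict inequality for a Laplace variable:

```python
import math

# Entropy power N(X) = exp(2 h(X)) / (2*pi*e) times Fisher information J(X),
# using standard closed forms for two one-dimensional distributions.
def stam_product_gaussian(sigma2):
    h = 0.5 * math.log(2 * math.pi * math.e * sigma2)
    N = math.exp(2 * h) / (2 * math.pi * math.e)  # equals sigma2
    J = 1.0 / sigma2
    return N * J

def stam_product_laplace(b):
    h = 1 + math.log(2 * b)                       # entropy of Laplace(b)
    N = math.exp(2 * h) / (2 * math.pi * math.e)  # equals 2*e*b^2/pi
    J = 1.0 / b ** 2                              # score is -sign(x)/b
    return N * J

print(stam_product_gaussian(2.0))  # 1.0: the Gaussian attains equality
print(stam_product_laplace(1.0))   # 2e/pi > 1: strict inequality
```

The Laplace product is the constant 2e/π ≈ 1.73 for every scale b, which illustrates that the Stam product is scale-invariant.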

In this paper we prove a sharp affine Lp Sobolev inequality for functions on R^n. The new inequality is significantly stronger than (and directly implies) the classical sharp Lp Sobolev inequality of Aubin [A2] and Talenti [T], even though it uses only the vector space structure and standard Lebesgue measure on R^n. For the new inequality, no inner…

- Onur G. Guleryuz, Erwin Lutwak, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2002

We show that for a special class of probability distributions that we call contoured distributions, information theoretic invariants and inequalities are equivalent to geometric invariants and inequalities of bodies in Euclidean space associated with the distributions. Using this, we obtain characterizations of contoured distributions with extremal Shannon…

- Erwin Lutwak, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2007

The pth moment matrix is defined for a real random vector, generalizing the classical covariance matrix. Sharp inequalities relating the pth moment and Rényi entropy are established, generalizing the classical inequality relating the second moment and the Shannon entropy. The extremal distributions for these inequalities are completely characterized.

- Erwin Lutwak, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2005
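The Rényi entropy appearing in the abstract above is h_λ(f) = (1/(1−λ)) ln ∫ f^λ, which recovers Shannon entropy as λ → 1. A small numeric sketch (function name and discretization parameters are mine) evaluates this defining integral for the standard Gaussian and compares it against the known closed form:

```python
import math

def renyi_entropy_std_normal(lam, xmax=12.0, n=20001):
    """Renyi entropy of order lam for N(0,1), computed directly from the
    defining formula h_lam = (1/(1-lam)) * ln( integral of f^lam )."""
    dx = 2 * xmax / (n - 1)
    total = 0.0
    for i in range(n):
        x = -xmax + i * dx
        f = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
        w = 0.5 if i in (0, n - 1) else 1.0  # trapezoid-rule weights
        total += w * f ** lam
    return math.log(total * dx) / (1 - lam)

# Closed form for N(0,1): 0.5*ln(2*pi) - ln(lam) / (2*(1-lam))
lam = 3.0
closed = 0.5 * math.log(2 * math.pi) - math.log(lam) / (2 * (1 - lam))
print(renyi_entropy_std_normal(lam), closed)  # the two values agree closely
```

Note that for λ > 1 the Rényi entropy of the standard Gaussian is smaller than its Shannon entropy 0.5·ln(2πe), consistent with h_λ being non-increasing in λ.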


A purely analytic proof is given for an inequality that has as a direct consequence the two most important affine isoperimetric inequalities of plane convex geometry: The Blaschke-Santaló inequality and the affine isoperimetric inequality of affine differential geometry.
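In the plane, the Blaschke-Santaló inequality says that for an origin-symmetric convex body K, the volume product area(K)·area(K°) is at most π², the value attained by the disk (which is its own polar). A small sketch (the shoelace helper and the choice of test body are mine) checks this for the square, whose polar body is the cross-polytope:

```python
import math

def polygon_area(verts):
    """Shoelace area of a polygon given its vertices in order."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(verts, verts[1:] + verts[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2

# Square K = [-1,1]^2 and its polar body K° = {y : |y1| + |y2| <= 1};
# the polar of the symmetric box is the corresponding cross-polytope.
square = [(1, 1), (-1, 1), (-1, -1), (1, -1)]
cross = [(1, 0), (0, 1), (-1, 0), (0, -1)]

product_square = polygon_area(square) * polygon_area(cross)  # 4 * 2 = 8
product_disk = math.pi * math.pi  # disk is self-polar; ellipses give equality

print(product_square, product_disk)  # 8 <= pi^2, as the inequality predicts
```

The gap 8 < π² for the square reflects the fact that equality holds only for ellipses.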

Corresponding to each origin-symmetric convex (or more general) subset of Euclidean n-space R^n, there is a unique ellipsoid with the following property: The moment of inertia of the ellipsoid and the moment of inertia of the convex set are the same about every 1-dimensional subspace of R^n. This ellipsoid is called the Legendre ellipsoid of the convex…

Associated with each body K in Euclidean n-space R^n is an ellipsoid Γ₂K called the Legendre ellipsoid of K. It can be defined as the unique ellipsoid centered at the body's center of mass such that the ellipsoid's moment of inertia about any axis passing through the center of mass is the same as that of the body. In an earlier paper the authors showed that…
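Up to normalization, the Legendre ellipsoid described above can be read off from the second-moment matrix M with entries M_ij = E[x_i x_j] over the body (the ellipsoid is {x : xᵀM⁻¹x ≤ c} for an appropriate constant). A minimal Monte Carlo sketch, assuming uniform sampling of a planar body (the function name and normalization are illustrative, not the authors' construction):

```python
import random

def second_moment_matrix(sample_point, n=100000, seed=0):
    """Monte Carlo estimate of M_ij = E[x_i x_j] over a planar body,
    from which the Legendre ellipsoid's shape matrix is read off."""
    rng = random.Random(seed)
    m = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(n):
        x = sample_point(rng)
        for i in range(2):
            for j in range(2):
                m[i][j] += x[i] * x[j] / n
    return m

# Uniform sampling of the square [-1, 1]^2; by symmetry its Legendre
# ellipsoid is a disk, so M should be close to (1/3) * identity.
unit_square = lambda rng: (rng.uniform(-1, 1), rng.uniform(-1, 1))
M = second_moment_matrix(unit_square)
print(M)  # diagonal entries near 1/3, off-diagonal entries near 0
```

For the square the estimate is nearly a multiple of the identity, matching the symmetry argument: any body invariant under the square's symmetry group has a circular Legendre ellipsoid.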