The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. …

- Erwin Lutwak, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2007

The pth moment matrix is defined for a real random vector, generalizing the classical covariance matrix. Sharp inequalities relating the pth moment and Rényi entropy are established, generalizing the classical inequality relating the second moment and the Shannon entropy. The extremal distributions for these inequalities are completely characterized.

- Erwin Lutwak, Songjun Lv, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2012

We explain how the classical notions of Fisher information of a random variable and Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more generalized versions proved earlier by the authors, are all special cases of more…
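The classical notions being extended here are the following standard definitions, for a random variable (or vector) with sufficiently smooth density f:

```latex
% Classical Fisher information of a random variable with density f:
I(X) = \int_{\mathbb{R}} \frac{f'(x)^{2}}{f(x)}\,dx
     = \mathbb{E}\!\left[\left(\frac{d}{dx}\log f(X)\right)^{2}\right].
% For a random vector, the Fisher information matrix replaces f' by \nabla f:
J(X) = \int_{\mathbb{R}^{n}} \frac{\nabla f(x)\,\nabla f(x)^{\mathsf{T}}}{f(x)}\,dx.
```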

In this paper we prove a sharp affine L_p Sobolev inequality for functions on R^n. The new inequality is significantly stronger than (and directly implies) the classical sharp L_p Sobolev inequality of Aubin [A2] and Talenti [T], even though it uses only the vector space structure and standard Lebesgue measure on R^n. For the new inequality, no inner…

- Onur G. Guleryuz, Erwin Lutwak, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2002

We show that for a special class of probability distributions that we call contoured distributions, information theoretic invariants and inequalities are equivalent to geometric invariants and inequalities of bodies in Euclidean space associated with the distributions. Using this, we obtain characterizations of contoured distributions with extremal Shannon…

Corresponding to each origin-symmetric convex (or more general) subset of Euclidean n-space R^n, there is a unique ellipsoid with the following property: The moment of inertia of the ellipsoid and the moment of inertia of the convex set are the same about every 1-dimensional subspace of R^n. This ellipsoid is called the Legendre ellipsoid of the convex…

- Erwin Lutwak, Deane Yang, Gaoyong Zhang
- IEEE Transactions on Information Theory
- 2005

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two…