The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities.
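The first claim can be checked numerically. The sketch below (an illustration, not part of the original abstract) compares the closed-form differential entropies of three unit-variance distributions; the Gaussian comes out largest, as the moment-entropy inequality predicts.

```python
import math

# Differential entropies (in nats) of three unit-variance distributions,
# computed from their standard closed-form expressions.

def entropy_gaussian(var):
    # h = (1/2) ln(2*pi*e*var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_laplace(var):
    # Laplace(b) has variance 2*b^2 and entropy 1 + ln(2b)
    b = math.sqrt(var / 2)
    return 1 + math.log(2 * b)

def entropy_uniform(var):
    # Uniform on an interval of width w has variance w^2/12 and entropy ln(w)
    w = math.sqrt(12 * var)
    return math.log(w)

h_g = entropy_gaussian(1.0)   # ~1.4189
h_l = entropy_laplace(1.0)    # ~1.3466
h_u = entropy_uniform(1.0)    # ~1.2425
```

Among all densities with second moment fixed at 1, the Gaussian attains the maximum entropy value 0.5*ln(2*pi*e); the Laplace and uniform entropies fall strictly below it.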
We explain how the classical notions of Fisher information of a random variable and the Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the more general versions proved earlier by the authors, are all special cases of more …
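In one dimension, the classical Stam inequality states N(X) J(X) >= 1, where N(X) = exp(2 h(X)) / (2*pi*e) is the entropy power and J(X) the Fisher information, with equality exactly for Gaussians. A minimal numerical check of this claim (an illustration added here, not from the abstract) for the Gaussian and Laplace cases:

```python
import math

def entropy_power(h):
    # One-dimensional entropy power: N(X) = exp(2 h(X)) / (2*pi*e)
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Unit-variance Gaussian: h = 0.5*ln(2*pi*e), Fisher information J = 1/var = 1.
h_gauss = 0.5 * math.log(2 * math.pi * math.e)
J_gauss = 1.0
stam_gauss = entropy_power(h_gauss) * J_gauss   # equality case: product = 1

# Unit-variance Laplace(b) with b = 1/sqrt(2): h = 1 + ln(2b),
# score = -sign(x)/b almost everywhere, so J = 1/b^2.
b = 1 / math.sqrt(2)
h_lap = 1 + math.log(2 * b)
J_lap = 1 / b**2
stam_lap = entropy_power(h_lap) * J_lap         # strictly greater than 1
```

The Gaussian achieves the product exactly 1, while the Laplace product exceeds 1, consistent with the equality case of Stam's inequality.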
In this paper we prove a sharp affine L_p Sobolev inequality for functions on R^n. The new inequality is significantly stronger than (and directly implies) the classical sharp L_p Sobolev inequality of Aubin [A2] and Talenti [T], even though it uses only the vector space structure and standard Lebesgue measure on R^n. For the new inequality, no inner …
We show that for a special class of probability distributions that we call contoured distributions, information-theoretic invariants and inequalities are equivalent to geometric invariants and inequalities of bodies in Euclidean space associated with the distributions. Using this, we obtain characterizations of contoured distributions with extremal Shannon …
The dominant consideration in the valuation of mortgage-backed securities (MBS) is modeling the prepayments of the pool of underlying mortgages. Current industry practice is to use historical data to project future prepayments. In this paper we introduce a new approach and show how it can be used to value both pools of mortgages and mortgage-backed …