
A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of rearrangements. For the special case of Boltzmann-Shannon entropy, this lower bound is better than that given by the entropy power inequality. Several applications are discussed, including a new proof of the classical entropy power inequality…
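For context, the classical entropy power inequality referenced in these abstracts can be stated as follows (standard form; the notation here is illustrative, not taken from the papers themselves):

```latex
% Entropy power of a random vector $X$ with density $f$ on $\mathbb{R}^n$:
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\qquad
h(X) = -\int_{\mathbb{R}^n} f(x)\log f(x)\,dx .

% Entropy power inequality (EPI): for independent $X$ and $Y$,
N(X+Y) \;\ge\; N(X) + N(Y),
% with equality iff $X$ and $Y$ are Gaussian with proportional covariances.
```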

Paclitaxel is a diterpenoid isolated from Taxus brevifolia. It is effective for various cancers, especially ovarian and breast cancer. Due to its aqueous insolubility, it is administered dissolved in ethanol and Cremophor EL (BASF, Ludwigshafen, Germany), which can cause serious allergic reactions. In order to eliminate Cremophor EL, paclitaxel was…

A new lower bound on the entropy of the sum of independent random vectors is demonstrated in terms of rearrangements. This lower bound is better than that given by the entropy power inequality. In fact, we use it to give a new, independent, and simple proof of the entropy power inequality in the case when the summands are identically distributed. We also…

A generalization of Young's inequality for convolution with sharp constant is conjectured for scenarios where more than two functions are being convolved, and it is proven for certain parameter ranges. The conjecture would provide a unified proof of recent entropy power inequalities of Barron and Madiman, as well as of a (conjectured) generalization of the…
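For reference, the two-function sharp form of Young's convolution inequality (due to Beckner and Brascamp–Lieb) that this work generalizes reads, in standard notation:

```latex
% For exponents $p, q, r \ge 1$ satisfying $\tfrac1p + \tfrac1q = 1 + \tfrac1r$,
% and $f \in L^p(\mathbb{R}^n)$, $g \in L^q(\mathbb{R}^n)$:
\| f * g \|_r \;\le\; \left( \frac{C_p\, C_q}{C_r} \right)^{\!n} \| f \|_p \, \| g \|_q,
\qquad
C_p^2 = \frac{p^{1/p}}{p'^{\,1/p'}}, \quad \frac1p + \frac1{p'} = 1.
% Equality holds for suitably matched Gaussians.
```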

An elementary proof is provided of sharp bounds for the varentropy of random vectors with log-concave densities, as well as for deviations of the information content from its mean. These bounds significantly improve on the bounds obtained by Bobkov and Madiman.
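The varentropy in question is, assuming the usual definition, the variance of the information content; a sketch of the quantity and the flavor of the sharp bound (dimensional, independent of the particular log-concave density) is:

```latex
% Information content of $X$ with density $f$ on $\mathbb{R}^n$:
\widetilde{h}(X) = -\log f(X).

% Varentropy:
V(X) = \operatorname{Var}\!\big(\widetilde{h}(X)\big)
     = \mathbb{E}\big[(\log f(X))^2\big] - \big(\mathbb{E}[\log f(X)]\big)^2.

% For log-concave $f$, the sharp bound is dimensional: $V(X) \le n$,
% with the exponential-type product densities as extremizers.
```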

- Mokshay Madiman, Liyao Wang
- 2013

We insert an interesting quantity involving rearrangements in between the two sides of the entropy power inequality, thereby refining it. The entropy power inequality (EPI) is a basic and powerful tool in information theory, and also has relevance to probability theory and mathematical physics. Applications include converse parts of various coding…
