A generalization of Young's inequality for convolution with sharp constant is conjectured for scenarios where more than two functions are being convolved, and it is proven for certain parameter ranges. The conjecture would provide a unified proof of recent entropy power inequalities of Barron and Madiman, as well as of a (conjectured) generalization of the…
A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of rearrangements. For the special case of Boltzmann-Shannon entropy, this lower bound is better than that given by the entropy power inequality. Several applications are discussed, including a new proof of the classical entropy power…
A new lower bound on the entropy of the sum of independent random vectors is demonstrated in terms of rearrangements. This lower bound is better than that given by the entropy power inequality. In fact, we use it to give a new, independent, and simple proof of the entropy power inequality in the case when the summands are identically distributed. We also…
Paclitaxel is a diterpenoid isolated from Taxus brevifolia. It is effective for various cancers, especially ovarian and breast cancer. Due to its aqueous insolubility, it is administered dissolved in ethanol and Cremophor EL (BASF, Ludwigshafen, Germany), which can cause serious allergic reactions. In order to eliminate Cremophor EL, paclitaxel was…
An elementary proof is provided of sharp bounds for the varentropy of random vectors with log-concave densities, as well as for deviations of the information content from its mean. These bounds significantly improve on the bounds obtained by Bobkov and Madiman.
A sharp uniform bound is obtained for the varentropy of the class of log-concave distributions. In particular, this yields the optimal strengthening of the equipartition property for such distributions recently proved by Bobkov and the first-named author.
A simple new lower bound is provided for the Rényi entropy of the convolution of probability distributions on the integers in terms of certain (discrete) rearrangements of these distributions. This inequality may be thought of as an entropy power inequality for integer-valued random variables.
We insert an interesting quantity involving rearrangements in between the two sides of the entropy power inequality, thereby refining it.
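For context, the classical entropy power inequality that the abstracts above refine can be stated as follows; this is the standard Shannon-Stam formulation, supplied here for reference rather than taken from any of the listed works:

```latex
% Entropy power inequality (Shannon-Stam): for independent
% random vectors X, Y in R^n with densities,
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) \;:=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
```

where \(h(X)\) denotes the differential entropy of \(X\) and \(N(X)\) its entropy power; equality holds when \(X\) and \(Y\) are Gaussian with proportional covariances. The refinements described above insert rearrangement-based quantities between the two sides of this inequality.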