Pinsker's inequality states that the relative entropy $d_{\mathrm{KL}}(X, Y)$ between two random variables $X$ and $Y$ dominates the square of the total variation distance $d_{\mathrm{TV}}(X, Y)$ between $X$ and $Y$. In this paper we introduce generalized Fisher information distances $J(X, Y)$ between discrete distributions $X$ and $Y$ and prove that these also dominate the square of the total variation distance.
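For reference, the classical inequality being generalized can be written as follows (in the standard normalization; the constants in the paper's generalized Fisher information bound may differ):

\[
d_{\mathrm{TV}}(X, Y)^2 \;\le\; \tfrac{1}{2}\, d_{\mathrm{KL}}(X, Y),
\qquad
d_{\mathrm{TV}}(X, Y) = \sup_{A} \bigl| \Pr[X \in A] - \Pr[Y \in A] \bigr|.
\]

The paper's result replaces the relative entropy on the right-hand side with a generalized Fisher information distance $J(X, Y)$.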
We introduce a new formalism for computing expectations of functionals of arbitrary random vectors, by using generalised integration by parts formulae. In doing so we extend recent representation formulae for the score function introduced in [19] and also provide a new proof of a central identity first discovered in [7]. We derive a representation for the …
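The prototype of such an integration by parts formula (a standard fact, not the paper's general version) is Stein's identity for the Gaussian law: if $Z \sim \mathcal{N}(0,1)$ and $f$ is absolutely continuous with $\mathbb{E}|f'(Z)| < \infty$, then

\[
\mathbb{E}[Z f(Z)] = \mathbb{E}[f'(Z)].
\]

More generally, for a random variable $X$ with differentiable density $p$ and suitable decay at infinity, integration by parts gives $\mathbb{E}[f(X)\, (\log p)'(X)] = -\mathbb{E}[f'(X)]$, where $(\log p)' = p'/p$ is the score function whose representation formulae are extended here.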
In this paper, we introduce new Stein identities for the gamma target distribution, as well as a new non-linear channel specifically designed for gamma inputs. From these two ingredients, we derive an explicit and simple formula for the derivative of the input-output mutual information of this non-linear channel with respect to the channel quality parameter.
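For orientation, the best-known formula of this kind is the Gaussian-channel I-MMSE relation of Guo, Shamai and Verdú (a classical fact; the paper's gamma-channel analogue is not reproduced in this excerpt): for $Y = \sqrt{\mathrm{snr}}\, X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$,

\[
\frac{d}{d\,\mathrm{snr}}\, I(X; Y) \;=\; \frac{1}{2}\, \mathbb{E}\Bigl[\bigl(X - \mathbb{E}[X \mid Y]\bigr)^2\Bigr].
\]

The result described above plays the same role with the Gaussian channel replaced by a non-linear channel adapted to gamma inputs, and with the derivative taken in the channel quality parameter.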
Let $X = \{X_n\}_{n \ge 1}$ and $Y = \{Y_n\}_{n \ge 1}$ be two independent random sequences. We obtain rates of convergence to the normal law of randomly weighted self-normalized sums. These rates are seen to hold for the convergence of a number of important statistics, such as Student's t-statistic or the empirical correlation coefficient.
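As an illustration of why such rates transfer to Student's t-statistic (a standard identity, not specific to the weighted setting of this paper): writing $W_n = \sum_{i=1}^{n} X_i / V_n$ with $V_n^2 = \sum_{i=1}^{n} X_i^2$ for the self-normalized sum, the t-statistic $T_n = \sqrt{n}\, \bar{X}_n / S_n$ (with sample mean $\bar{X}_n$ and sample standard deviation $S_n$) satisfies

\[
T_n \;=\; W_n \left( \frac{n-1}{\,n - W_n^2\,} \right)^{1/2},
\]

so a rate of convergence to the normal law for $W_n$ yields one for $T_n$.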