
- Christophe Ley, Yvik Swan
- 2012

We provide a new perspective on Stein's so-called density approach by introducing a new operator and characterizing class which are valid for a much wider family of probability distributions on the real line. We prove an elementary factorization property of this operator and propose a new Stein identity which we use to derive information inequalities in…
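For context (a standard statement, not quoted from the paper itself): the classical density-approach identity that this operator generalizes is Stein's characterization of the standard normal law,

```latex
% Classical Stein identity: X ~ N(0,1) if and only if
% E[f'(X) - X f(X)] = 0 for all sufficiently smooth f.
\mathbb{E}\bigl[ f'(X) - X f(X) \bigr] = 0
\quad \text{for all absolutely continuous } f \text{ with } \mathbb{E}\,|f'(X)| < \infty .
```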

- Ivan Nourdin, Giovanni Peccati, Yvik Swan
- ArXiv
- 2013

We develop a new method for bounding the relative entropy of a random vector in terms of its Stein factors. Our approach is based on a novel representation for the score function of smoothly perturbed random variables, as well as on de Bruijn's identity of information theory. When applied to sequences of functionals of a general Gaussian field, our…
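For reference, the de Bruijn identity invoked here is the classical link between differential entropy and Fisher information along Gaussian smoothing (standard formulation, not the paper's notation): with $Z$ a standard Gaussian vector independent of $X$, $H$ the differential entropy, and $J$ the Fisher information,

```latex
% de Bruijn's identity:
\frac{\mathrm{d}}{\mathrm{d}t}\, H\!\left(X + \sqrt{t}\, Z\right)
  = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right), \qquad t > 0 .
```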

Gauss' principle states that the maximum likelihood estimator of the parameter in a location family is the sample mean for all samples of all sample sizes if and only if the family is Gaussian. There exist many extensions of this result in diverse directions. In this paper we propose a unified treatment of this literature. In doing so we define the…
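The textbook reasoning behind Gauss' principle (a standard formulation, not the paper's own notation): in a location family with density $f(x-\theta)$, the MLE solves the likelihood equation $\sum_i \psi(x_i - \hat{\theta}) = 0$ with score $\psi = -f'/f$, and the solution is the sample mean for every sample exactly when $\psi$ is linear, i.e. when $f$ is Gaussian:

```latex
% psi linear (psi(u) = u / sigma^2)  <=>  f Gaussian  <=>
\hat{\theta} = \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i
\quad \text{for all } (x_1,\dots,x_n) \text{ and all } n
\iff f(x) \propto e^{-x^2/(2\sigma^2)} .
```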

We propose a new general version of Stein's method for univariate distributions. In particular we propose a canonical definition of the Stein operator of a probability distribution which is based on a linear difference or differential-type operator. The resulting Stein identity highlights the unifying theme behind the literature on Stein's method (both for…
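A common differential-type instance of such a Stein operator (the density-approach form, given here only as illustration): for a target distribution with differentiable density $p$,

```latex
% Density-approach Stein operator; E[A_p f(X)] = 0 when X has density p.
\mathcal{A}_p f(x) = \frac{\bigl(f(x)\,p(x)\bigr)'}{p(x)}
                   = f'(x) + f(x)\,\frac{p'(x)}{p(x)} .
```

Taking $p$ the standard normal density recovers the classical operator $f \mapsto f'(x) - x f(x)$.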

- Ivan Nourdin, Giovanni Peccati, Yvik Swan
- 2014 IEEE International Symposium on Information…
- 2014

We introduce a new formalism for computing expectations of functionals of arbitrary random vectors, by using generalised integration by parts formulae. In doing so we extend recent representation formulae for the score function introduced in [19] and also provide a new proof of a central identity first discovered in [7]. We derive a representation for the…

- Marc Hallin, Yvik Swan, +5 authors David Veredas
- 2011

Classical estimation techniques for linear models either are inconsistent, or perform rather poorly, under α-stable error densities; most of them are not even rate-optimal. In this paper, we propose an original one-step R-estimation method and investigate its asymptotic performance under stable densities. Contrary to traditional least squares, the proposed…

- Marc Hallin, Yvik Swan, +5 authors David Veredas
- 2010

Linear models with stable error densities are considered. The local asymptotic normality of the resulting model is established. We use this result, combined with Le Cam's third lemma, to obtain local powers of various classical rank tests (Wilcoxon's and van der Waerden's test, the median test, and their counterparts for regression and analysis of…

- Christophe Ley, Yvik Swan
- ArXiv
- 2012

- Christophe Ley, Yvik Swan
- IEEE Transactions on Information Theory
- 2013

Pinsker's inequality states that the relative entropy between two random variables X and Y dominates the square of the total variation distance between X and Y. In this paper, we introduce generalized Fisher information distances and prove that these also dominate the square of the total variation distance. To this end, we introduce a general discrete Stein…
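In symbols, the Pinsker inequality referenced here (standard form, with $D$ the relative entropy in nats and $d_{\mathrm{TV}}$ the total variation distance) reads:

```latex
% Pinsker's inequality:
d_{\mathrm{TV}}(P_X, P_Y)^2 \le \tfrac{1}{2}\, D\bigl(P_X \,\|\, P_Y\bigr) .
```

The paper's contribution is to prove bounds of the same shape with generalized Fisher information distances in place of the relative entropy.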