- Christophe Ley, Yvik Swan
- 2012

We provide a new perspective on Stein’s so-called density approach by introducing a new operator and characterizing classes which are valid for a much wider family of probability distributions on the real line. We prove an elementary factorization property of this operator and propose a new Stein identity which we use to derive information inequalities in…

- Ivan Nourdin, Giovanni Peccati, Yvik Swan
- 2014 IEEE International Symposium on Information…
- 2014

We introduce a new formalism for computing expectations of functionals of arbitrary random vectors, by using generalised integration by parts formulae. In doing so we extend recent representation formulae for the score function introduced in [19] and also provide a new proof of a central identity first discovered in [7]. We derive a representation for the…

Let X1, X2, . . . , Xn be independent random variables uniformly distributed on [0, 1]. We observe these sequentially and have to stop on exactly one of them. No recall of preceding observations is permitted. What stopping rule minimizes the expected rank of the selected observation? What is the value of the expected rank (as a function of n) and what is…
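The stopping problem above is easy to explore numerically. The sketch below is a minimal Monte Carlo illustration using a hypothetical fixed-threshold rule (stop at the first observation below a threshold `tau`, take the last observation otherwise); this is *not* the optimal rule analysed in the paper, only a baseline showing how expected rank is estimated.

```python
import random

def expected_rank_threshold(n, tau, trials=20000, seed=0):
    """Monte Carlo estimate of the expected rank of the selected
    observation under a fixed-threshold stopping rule: stop at the
    first X_i < tau, or at X_n if no observation qualifies.
    Rank 1 corresponds to the smallest value in the full sample."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        # index of the first observation below the threshold, else the last
        stop = next((i for i, x in enumerate(xs) if x < tau), n - 1)
        # rank of the chosen observation within the whole sample
        rank = 1 + sum(x < xs[stop] for x in xs)
        total += rank
    return total / trials

# Even a crude fixed threshold beats stopping at a random position
# (which for n = 50 would give expected rank (n + 1) / 2 = 25.5).
print(expected_rank_threshold(n=50, tau=0.05))
```

Optimising over rules of this form already hints at why the true optimal expected rank stays bounded as n grows.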

- Benjamin Arras, Ehsan Azmoodeh, Guillaume Poly, Yvik Swan
- 2016

In the first part of the paper we use a new Fourier technique to obtain Stein characterizations for random variables in the second Wiener chaos. We provide the connection between this result and similar conclusions that can be derived using Malliavin calculus. We also introduce a new form of discrepancy which we use, in the second part of the paper, to…

- Ivan Nourdin, Giovanni Peccati, Yvik Swan
- ArXiv
- 2013

We develop a new method for bounding the relative entropy of a random vector in terms of its Stein factors. Our approach is based on a novel representation for the score function of smoothly perturbed random variables, as well as on de Bruijn’s identity of information theory. When applied to sequences of functionals of a general Gaussian field, our…
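For context, the standard form of de Bruijn's identity invoked here relates entropy under Gaussian smoothing to Fisher information (notation below is the usual one, not taken from the paper): for $Z \sim \mathcal{N}(0,1)$ independent of $X$,

```latex
\frac{\mathrm{d}}{\mathrm{d}t}\, h\bigl(X + \sqrt{t}\,Z\bigr)
  = \frac{1}{2}\, J\bigl(X + \sqrt{t}\,Z\bigr), \qquad t > 0,
```

where $h$ denotes differential entropy and $J(Y) = \mathbb{E}\bigl[(p_Y'(Y)/p_Y(Y))^{2}\bigr]$ is the Fisher information of $Y$.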

We propose a new general version of Stein’s method for univariate distributions. In particular we propose a canonical definition of the Stein operator of a probability distribution which is based on a linear difference or differential-type operator. The resulting Stein identity highlights the unifying theme behind the literature on Stein’s method (both for…

- Marc Hallin, Yvik Swan, +5 authors David Veredas
- 2011

Classical estimation techniques for linear models either are inconsistent, or perform rather poorly, under α-stable error densities; most of them are not even rate-optimal. In this paper, we propose an original one-step R-estimation method and investigate its asymptotic performance under stable densities. Contrary to traditional least squares, the proposed…

- Christophe Ley, Yvik Swan
- IEEE Transactions on Information Theory
- 2013

Pinsker's inequality states that the relative entropy between two random variables X and Y dominates the square of the total variation distance between X and Y. In this paper, we introduce generalized Fisher information distances and prove that these also dominate the square of the total variation distance. To this end, we introduce a general discrete Stein…
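The domination statement above is, in its classical form (relative entropy in nats, total variation $d_{\mathrm{TV}}(X,Y) = \tfrac{1}{2}\sum_x |p_X(x) - p_Y(x)|$):

```latex
d_{\mathrm{TV}}(X, Y)^{2} \;\le\; \frac{1}{2}\, D_{\mathrm{KL}}\bigl(p_X \,\Vert\, p_Y\bigr).
```

The paper's generalized Fisher information distances are shown to satisfy an analogous lower bound on the squared total variation distance.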

Gauss’ principle states that the maximum likelihood estimator of the parameter in a location family is the sample mean for all samples of all sample sizes if and only if the family is Gaussian. There exist many extensions of this result in diverse directions. In this paper we propose a unified treatment of this literature. In doing so we define the…

- Christophe Ley, Yvik Swan
- ArXiv
- 2012
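The easy direction of Gauss' principle is a one-line computation (standard argument, not specific to the paper): for the Gaussian location family the log-likelihood is

```latex
\ell(\theta) = -\frac{n}{2}\log(2\pi\sigma^{2})
  - \frac{1}{2\sigma^{2}} \sum_{i=1}^{n} (X_i - \theta)^{2},
\qquad
\ell'(\theta) = \frac{1}{\sigma^{2}} \sum_{i=1}^{n} (X_i - \theta) = 0
\;\Longrightarrow\;
\hat{\theta} = \bar{X}_n,
```

so the MLE of the location parameter is the sample mean; the converse (only the Gaussian family has this property) is the substantive half of the characterization.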