Multivariate normal approximations by Stein's method and size bias couplings

@article{Goldstein1996MultivariateNA,
  title={Multivariate normal approximations by Stein's method and size bias couplings},
  author={Larry Goldstein and Yosef Rinott},
  journal={Journal of Applied Probability},
  year={1996},
  volume={33},
  pages={1--17}
}
Stein's method is used to obtain two theorems on multivariate normal approximation. Our main theorem, Theorem 1.2, provides a bound on the distance to normality for any non-negative random vector. Theorem 1.2 requires multivariate size bias coupling, which we discuss in studying the approximation of distributions of sums of dependent random vectors. In the univariate case, we briefly illustrate this approach for certain sums of nonlinear functions of multivariate normal variables. As a second… 
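For context, the univariate size-biased distribution underlying this coupling (a standard definition, not the paper's multivariate extension itself) is: for a non-negative random variable $X$ with $0 < E[X] < \infty$, a variable $X^s$ has the $X$-size-biased distribution if $$P(X^s \in dx) = \frac{x \, P(X \in dx)}{E[X]},$$ and a size bias coupling is a construction of $X$ and $X^s$ on a common probability space, typically chosen so that $X^s - X$ is small or tractable.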

Improved bounds in Stein's method for functions of multivariate normal random vectors

In a recent paper, Gaunt [17] extended Stein’s method to limit distributions that can be represented as a function $g : \mathbb{R}^d \to \mathbb{R}$ of a centered multivariate normal random vector $\Sigma^{1/2} Z$, with $Z$ a

Stein's method for normal approximation

Stein’s method originated in 1972 in a paper in the Proceedings of the Sixth Berkeley Symposium. In that paper, he introduced the method in order to determine the accuracy of the normal approximation

Multivariate Normal Approximation by Stein's Method: The Concentration Inequality Approach

The concentration inequality approach for normal approximation by Stein's method is generalized to the multivariate setting. We use this approach to prove a non-smooth function distance for

Stein couplings for normal approximation

In this article we propose a general framework for normal approximation using Stein's method. We introduce the new concept of Stein couplings and we show that it lies at the heart of popular

Stein’s method for discrete Gibbs measures

Stein's method provides a way of bounding the distance of a probability distribution to a target distribution $\mu$. Here we develop Stein's method for the class of discrete Gibbs measures with a

RATES OF MULTIVARIATE NORMAL APPROXIMATION FOR STATISTICS IN GEOMETRIC PROBABILITY

We employ stabilization methods and second order Poincaré inequalities to establish rates of multivariate normal convergence for a large class of vectors $(H_s^{(1)}, \ldots, H_s^{(m)})$, $s \geq 1$, of

On Stein's method for products of normal random variables and zero bias couplings

In this paper we extend Stein's method to the distribution of the product of $n$ independent mean zero normal random variables. A Stein equation is obtained for this class of distributions, which

Stein's method for comparison of univariate distributions

We propose a new general version of Stein's method for univariate distributions. In particular we propose a canonical definition of the Stein operator of a probability distribution which is based on
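As a classical point of comparison (the standard normal case only, not the paper's general canonical construction), the Stein operator of the $N(0,1)$ distribution is $$\mathcal{A}f(x) = f'(x) - x f(x),$$ which characterizes the law: a real random variable $Z$ is standard normal if and only if $E[\mathcal{A}f(Z)] = 0$ for all absolutely continuous $f$ with $E|f'(Z)| < \infty$.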
...

References

A bound for the error in the normal approximation to the distribution of a sum of dependent random variables

This paper has two aims, one fairly concrete and the other more abstract. In Section 3, bounds are obtained under certain conditions for the departure of the distribution of the sum of n terms of a

A Weak Law of Large Numbers for Empirical Measures via Stein's Method

Let $E$ be a locally compact Hausdorff space with countable basis and let $(X_i)_{i \in \mathbb{N}}$ be a family of random elements on $E$ with $(1/n) \sum_{i=1}^{n} \mathcal{L}(X_i) \Rightarrow \mu$ $(n \to \infty)$ for a measure $\mu$ with $\|\mu\| \leq 1$. Conditions

Poisson convergence and semi-induced properties of random graphs

Barbour [1] invented an ingenious method of establishing the asymptotic distribution of the number X of specified subgraphs of a random graph. The novelty of his method relies on using the first two

Some Examples of Normal Approximations by Stein’s Method

Stein’s method is applied to study the rate of convergence in the normal approximation for sums of non-linear functionals of correlated Gaussian random variables, for the exceedances of r-scans of

Stein's method for diffusion approximations

Stein's method of obtaining distributional approximations is developed in the context of functional approximation by the Wiener process and other Gaussian processes. An appropriate analogue of

The asymptotic distributions of generalized U-statistics with applications to random graphs

We consider the random variable $$S_{n,v}(f) = \sum\limits_{i_1 < \cdots < i_v \leq n} f(X_{i_1}, \ldots, X_{i_v}, Y_{i_1 i_2}, \ldots, Y_{i_{v-1} i_v}),$$ where $\{X_i\}_{i=1}^{n}$ and $\{Y_{ij}\}_{1 \leq i < j \leq n}$ are

On the number of vertices of given degree in a random graph

Some values of the edge probability $p$ for which the number of vertices of a given degree of a random graph $G \in \mathcal{G}(n, p)$ asymptotically has a normal distribution are determined.

Approximate computation of expectations