Multivariate normal approximations by Stein's method and size bias couplings

@article{Goldstein1996MultivariateNA,
  title={Multivariate normal approximations by Stein's method and size bias couplings},
  author={Larry Goldstein and Yosef Rinott},
  journal={Journal of Applied Probability},
  year={1996},
  volume={33},
  pages={1--17}
}
Stein's method is used to obtain two theorems on multivariate normal approximation. Our main theorem, Theorem 1.2, provides a bound on the distance to normality for any non-negative random vector. Theorem 1.2 requires multivariate size bias coupling, which we discuss in studying the approximation of distributions of sums of dependent random vectors. In the univariate case, we briefly illustrate this approach for certain sums of nonlinear functions of multivariate normal variables. As a second… 
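
For orientation, the coupling required by Theorem 1.2 can be sketched in the univariate case (a standard definition, given here for context rather than quoted from the paper): for a non-negative random variable $X$ with $0 < EX = \mu < \infty$, a variable $X^s$ has the $X$-size biased distribution when $$E[X f(X)] = \mu\, E[f(X^s)]$$ for all functions $f$ for which both expectations exist; the multivariate coupling discussed in the paper size biases a random vector in each coordinate direction in an analogous way.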

Stein's method for normal approximation

Stein’s method originated in 1972 in a paper in the Proceedings of the Sixth Berkeley Symposium. In that paper, Stein introduced the method in order to determine the accuracy of the normal approximation
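
The characterizing equation at the heart of the method (a standard statement, recalled here for context rather than taken from the survey) is that a real random variable $Z$ is standard normal if and only if $$E[f'(Z) - Z f(Z)] = 0$$ for every absolutely continuous $f$ with $E|f'(Z)| < \infty$; bounding $E[f'(W) - W f(W)]$ over solutions $f$ of the associated Stein equation then bounds the distance from the law of $W$ to the standard normal.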

Multivariate Normal Approximation by Stein's Method: The Concentration Inequality Approach

The concentration inequality approach for normal approximation by Stein's method is generalized to the multivariate setting. We use this approach to prove a non-smooth function distance for

Stein couplings for normal approximation

In this article we propose a general framework for normal approximation using Stein's method. We introduce the new concept of Stein couplings and we show that it lies at the heart of popular
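
Roughly, and paraphrasing the usual formulation rather than this article's exact statement, a Stein coupling is a triple $(W, W', G)$ of square-integrable random variables satisfying $$E[G f(W') - G f(W)] = E[W f(W)]$$ for all functions $f$ for which these expectations exist; exchangeable pairs and local dependence constructions, among others, can be put into this form.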

Stein’s method for discrete Gibbs measures

Stein's method provides a way of bounding the distance of a probability distribution to a target distribution $\mu$. Here we develop Stein's method for the class of discrete Gibbs measures with a

RATES OF MULTIVARIATE NORMAL APPROXIMATION FOR STATISTICS IN GEOMETRIC PROBABILITY

We employ stabilization methods and second order Poincaré inequalities to establish rates of multivariate normal convergence for a large class of vectors $(H_s^{(1)}, \dots, H_s^{(m)})$, $s \ge 1$, of

On Stein's method for products of normal random variables and zero bias couplings

In this paper we extend Stein's method to the distribution of the product of $n$ independent mean zero normal random variables. A Stein equation is obtained for this class of distributions, which
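
For reference, the zero bias coupling mentioned in the title has the following standard univariate definition (recalled here for context, not quoted from the paper): for a mean zero random variable $X$ with variance $\sigma^2$, a variable $X^*$ has the $X$-zero biased distribution when $$E[X f(X)] = \sigma^2 E[f'(X^*)]$$ for all absolutely continuous $f$ for which the expectations exist, and the mean zero normal laws are exactly the fixed points of this transformation.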

Stein's method for Conditional Central Limit Theorem

In the seventies, Charles Stein revolutionized the way of proving the Central Limit Theorem by introducing a method that utilizes a characterization equation for the Gaussian distribution. In the last

Stein's density method for multivariate continuous distributions

This paper provides a general framework for Stein’s density method for multivariate continuous distributions. The approach associates to any probability density function a canonical operator and
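
In that framework, sketched here in the univariate case from the density-method literature rather than in this paper's exact notation, a sufficiently regular density $p$ is paired with the canonical operator $$\mathcal{T}_p f(x) = \frac{\big(f(x)\, p(x)\big)'}{p(x)}$$ on the support of $p$, and $E[\mathcal{T}_p f(X)] = 0$ for a suitable class of test functions $f$ characterizes $X$ as having density $p$.
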
...

References

A bound for the error in the normal approximation to the distribution of a sum of dependent random variables

This paper has two aims, one fairly concrete and the other more abstract. In Section 3, bounds are obtained under certain conditions for the departure of the distribution of the sum of n terms of a

A Weak Law of Large Numbers for Empirical Measures via Stein's Method

Let $E$ be a locally compact Hausdorff space with countable basis and let $(X_i)_{i \in \mathbb{N}}$ be a family of random elements on $E$ with $(1/n)\sum_{i=1}^{n} \mathcal{L}(X_i) \Rightarrow \mu$ $(n \to \infty)$ for a measure $\mu$ with $\|\mu\| \le 1$. Conditions

Poisson convergence and semi-induced properties of random graphs

Barbour [1] invented an ingenious method of establishing the asymptotic distribution of the number X of specified subgraphs of a random graph. The novelty of his method relies on using the first two

Stein's method for diffusion approximations

Stein's method of obtaining distributional approximations is developed in the context of functional approximation by the Wiener process and other Gaussian processes. An appropriate analogue of

The asymptotic distributions of generalized U-statistics with applications to random graphs

We consider the random variable $$S_{n,v}(f) = \sum\limits_{i_1 < \cdots < i_v \leqq n} f(X_{i_1}, \ldots, X_{i_v}, Y_{i_1 i_2}, \ldots, Y_{i_{v-1} i_v}),$$ where $\{X_i\}_{i=1}^{n}$ and $\{Y_{ij}\}_{1 \leqq i < j \leqq n}$ are

On Normal Approximations of Distributions in Terms of Dependency Graphs

The author interprets the error bounds introduced by Stein for the normal approximation of sums of dependent random variables in terms of dependency graphs. This leads to
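
For context (standard terminology rather than a quotation from this reference), a graph on the index set $V$ of a family $\{X_i\}_{i \in V}$ is a dependency graph if $$\{X_i\}_{i \in A} \text{ and } \{X_j\}_{j \in B} \text{ are independent whenever } A, B \subset V \text{ are disjoint and no edge joins } A \text{ to } B,$$ and the resulting normal approximation bounds are typically stated in terms of the maximal degree of this graph.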

Approximate computation of expectations

On multivariate normal approximations by Stein's method and size bias couplings: Technical Report, 1994