Approximation of Projections of Random Vectors

@article{Meckes2009ApproximationOP,
  title={Approximation of Projections of Random Vectors},
  author={Elizabeth S. Meckes},
  journal={Journal of Theoretical Probability},
  year={2009},
  volume={25},
  pages={333-352}
}
Let X be a d-dimensional random vector and X_θ its projection onto the span of a set of orthonormal vectors {θ_1, …, θ_k}. Conditions on the distribution of X are given such that if θ is chosen according to Haar measure on the Stiefel manifold, the bounded-Lipschitz distance from X_θ to a Gaussian distribution is concentrated at its expectation; furthermore, an explicit bound is given for the expected distance, in terms of d, k, and the distribution of X, allowing consideration not just of fixed k…
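As a quick numerical illustration of this setting (a sketch only, not the paper's method or its bounds), one can draw a Haar-distributed orthonormal k-frame via the QR decomposition of a Gaussian matrix and check that projections of a non-Gaussian X look approximately Gaussian; the dimensions and the uniform distribution for X are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n_samples = 200, 2, 5000

# Haar-random orthonormal k-frame: the QR decomposition of a Gaussian
# matrix yields columns distributed according to Haar measure on the
# Stiefel manifold (after fixing the signs of R's diagonal).
G = rng.standard_normal((d, k))
Q, R = np.linalg.qr(G)
Q *= np.sign(np.diag(R))  # sign fix for the exact Haar distribution

# X: high-dimensional samples with iid non-Gaussian coordinates
# (uniform on [-sqrt(3), sqrt(3)], so each coordinate has variance 1).
X = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n_samples, d))

X_theta = X @ Q  # k-dimensional projections X_theta

# Empirically, the projected coordinates are close to standard Gaussian:
print(np.round(X_theta.mean(axis=0), 2))
print(np.round(X_theta.std(axis=0), 2))
```

With d = 200 the empirical means are near 0 and the standard deviations near 1, consistent with the typically-Gaussian-marginals phenomenon the paper quantifies.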
LARGE DEVIATIONS FOR RANDOM PROJECTIONS OF ℓp BALLS
Let p ∈ [1, ∞]. Consider the projection of a uniform random vector from a suitably normalized ℓp ball in R^n onto an independent random vector from the unit sphere. We show that sequences of such random
Conditional central limit theorems for Gaussian projections
  • G. Reeves
  • Computer Science, Mathematics
    2017 IEEE International Symposium on Information Theory (ISIT)
  • 2017
This paper addresses the question of when projections of a high-dimensional random vector are approximately Gaussian; the results are bounds on the deviation between the conditional distribution of the projections and a Gaussian approximation, where the conditioning is on the projection matrix.
GEOMETRIC SHARP LARGE DEVIATIONS FOR RANDOM PROJECTIONS OF ℓp SPHERES
Accurate estimation of tail probabilities of projections of high-dimensional probability measures is of relevance in high-dimensional statistics, asymptotic geometric analysis and computer science.
Projections of Probability Distributions: A Measure-Theoretic Dvoretzky Theorem
Many authors have studied the phenomenon of typically Gaussian marginals of high-dimensional random vectors; e.g., for a probability measure on R^d, under mild conditions, most
Change-Point Detection of the Mean Vector with Fewer Observations than the Dimension Using Instantaneous Normal Random Projections
Our aim in this paper is to propose a simple method of a change-point detection of mean vector when the number of samples (historical data set) is smaller than the dimension. We restrict here our
Fast Approximation of the Sliced-Wasserstein Distance Using Concentration of Random Projections
This work adopts a new perspective to approximate the sliced-Wasserstein (SW) distance by making use of the concentration of measure phenomenon, developing a simple deterministic approximation that is both accurate and easy to use compared to the usual Monte Carlo approximation.
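For context, the Monte Carlo baseline that this work aims to replace is short to write down; a minimal sketch (the function name, the use of the 2-Wasserstein cost, and the equal-sample-size restriction are choices for this illustration):

```python
import numpy as np

def sliced_wasserstein_mc(X, Y, n_proj=200, rng=None):
    """Monte Carlo estimate of the sliced 2-Wasserstein distance between
    two empirical measures with the same number of points."""
    if rng is None:
        rng = np.random.default_rng()
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)
        # 1D W2^2 between equal-size empirical measures: sort and compare.
        total += np.mean((np.sort(X @ theta) - np.sort(Y @ theta)) ** 2)
    return np.sqrt(total / n_proj)

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 10))
Y = rng.standard_normal((500, 10)) + 1.0  # copy shifted by the all-ones vector
sw = sliced_wasserstein_mc(X, Y, rng=rng)
print(round(float(sw), 3))
```

The estimate fluctuates with the random projections, which is exactly the Monte Carlo variance that a concentration-of-measure-based deterministic approximation avoids.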
High-dimensional limit theorems for random vectors in ℓ_p^n-balls. II
In this paper, we prove three fundamental types of limit theorems for the ℓ_q-norm of random vectors chosen at random in an ℓ_p^n-ball in high dimensions. We obtain a
Stability of Random-Projection Based Classifiers. The Bayes Error Perspective
The relatively low variance of the Bayes error introduced by random projections confirms the stability of random-projection based classifiers, at least under the proposed assumptions.

References

Showing 1–10 of 28 references
The Sizes of Compact Subsets of Hilbert Space and Continuity of Gaussian Processes
The first two sections of this paper are introductory and correspond to the two halves of the title. As is well known, there is no complete analog of Lebesgue or Haar measure in an
Quantitative asymptotics of graphical projection pursuit
There is a result of Diaconis and Freedman which says that, in a limiting sense, for large collections of high-dimensional data most one-dimensional projections of the data are approximately
Sudakov's typical marginals, random linear functionals and a conditional central limit theorem
V.N. Sudakov [Sud78] proved that the one-dimensional marginals of a high-dimensional second order measure are close to each other in most directions. Extending this and a related result in the
Asymptotic Theory Of Finite Dimensional Normed Spaces
The Concentration of Measure Phenomenon in the Theory of Normed Spaces.- Preliminaries.- The Isoperimetric Inequality on S^{n-1} and Some Consequences.- Finite Dimensional Normed Spaces, Preliminaries.-
Poisson Approximation for Dependent Trials
by a Poisson distribution and a derivation of a bound on the distance between the distribution of W and the Poisson distribution with mean E(W). This new method is based on previous work by C. Stein
On Stein's method for multivariate normal approximation
The purpose of this paper is to synthesize the approaches taken by Chatterjee–Meckes and Reinert–Röllin in adapting Stein's method of exchangeable pairs for multivariate normal approximation. The
MULTIVARIATE NORMAL APPROXIMATION USING EXCHANGEABLE PAIRS
Since the introduction of Stein's method in the early 1970s, much research has been done in extending and strengthening it; however, there does not exist a version of Stein's original method of
Two moments suffice for Poisson approximations: the Chen-Stein method
Convergence to the Poisson distribution, for the number of occurrences of dependent events, can often be established by computing only first and second moments, but not higher ones. This remarkable
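The flavor of this result can be seen in the classical birthday problem, where the pair-match indicators are dependent but the first moment already gives a good Poisson approximation; a small sketch (the choice of 23 people and 365 days is just the standard example, not taken from the paper):

```python
import math

# Birthday problem: W = number of coinciding pairs among m people with
# d equally likely birthdays.  The Chen-Stein heuristic compares the
# exact P(W = 0) with exp(-E[W]), where E[W] = C(m, 2) / d.
m, d = 23, 365
lam = m * (m - 1) / 2 / d  # expected number of matching pairs

# Exact probability of no shared birthday, by the product formula.
exact_no_match = 1.0
for i in range(m):
    exact_no_match *= (d - i) / d

poisson_no_match = math.exp(-lam)
print(round(exact_no_match, 4), round(poisson_no_match, 4))
```

The two probabilities agree to about two decimal places, even though the C(23, 2) = 253 pair events are not independent; the Chen–Stein method turns this heuristic into an explicit error bound.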
Poisson Approximation and the Chen-Stein Method
The Chen-Stein method of Poisson approximation is a powerful tool for computing an error bound when approximating probabilities using the Poisson distribution. In many cases, this bound may be given
Stein's Method: Expository Lectures and Applications
A review of Stein's method applied to the case of discrete random variables, and an attempt to complete one of Stein's open problems: providing a discrete version of Chapter 6 of his book.