Concentration inequalities and moment bounds for sample covariance operators

@article{Koltchinskii2014ConcentrationIA,
  title={Concentration inequalities and moment bounds for sample covariance operators},
  author={Vladimir Koltchinskii and Karim Lounici},
  journal={Bernoulli},
  year={2017},
  volume={23},
  pages={110-133}
}
Let $X,X_1,\dots, X_n,\dots$ be i.i.d. centered Gaussian random variables in a separable Banach space $E$ with covariance operator $\Sigma$: $$ \Sigma: E^{\ast}\to E,\qquad \Sigma u = {\mathbb E}\langle X,u\rangle X,\quad u\in E^{\ast}. $$ The sample covariance operator $\hat \Sigma: E^{\ast}\to E$ is defined as $$ \hat \Sigma u := n^{-1}\sum_{j=1}^n \langle X_j,u\rangle X_j,\quad u\in E^{\ast}. $$ The goal of the paper is to obtain concentration inequalities and expectation bounds for the operator norm…
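The definitions above can be illustrated with a minimal finite-dimensional sketch. Everything below — the dimension, sample size, and eigenvalue decay — is an illustrative assumption, not taken from the paper; it simply computes the sample covariance $\hat\Sigma$, the operator-norm error $\|\hat\Sigma-\Sigma\|$, and the effective rank $\mathbf{r}(\Sigma)=\mathrm{tr}(\Sigma)/\|\Sigma\|$ that governs the paper's bounds.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 50, 2000  # illustrative dimension and sample size

# A hypothetical covariance with polynomially decaying eigenvalues.
eigvals = 1.0 / np.arange(1, p + 1) ** 2
Sigma = np.diag(eigvals)

# n i.i.d. centered Gaussian observations X_1, ..., X_n in R^p.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Sample covariance operator: Sigma_hat u = n^{-1} sum_j <X_j, u> X_j,
# which in matrix form is n^{-1} sum_j X_j X_j^T.
Sigma_hat = X.T @ X / n

# Operator-norm error ||Sigma_hat - Sigma||: the quantity the paper bounds.
err = np.linalg.norm(Sigma_hat - Sigma, ord=2)

# Effective rank r(Sigma) = tr(Sigma) / ||Sigma||, the key complexity
# parameter in the Koltchinskii-Lounici bounds.
eff_rank = np.trace(Sigma) / np.linalg.norm(Sigma, ord=2)
print(err, eff_rank)
```

With these decaying eigenvalues the effective rank stays small even though $p=50$, so the operator-norm error is far smaller than a crude $\sqrt{p/n}$ heuristic would suggest — which is the dimension-free phenomenon the paper quantifies.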
Asymptotics and Concentration Bounds for Bilinear Forms of Spectral Projectors of Sample Covariance
Let $X,X_1,\dots, X_n$ be i.i.d. Gaussian random variables with zero mean and covariance operator $\Sigma={\mathbb E}(X\otimes X)$ taking values in a separable Hilbert space ${\mathbb H}.$ …
Normal approximation and concentration of spectral projectors of sample covariance
Let $X,X_1,\dots, X_n$ be i.i.d. Gaussian random variables in a separable Hilbert space ${\mathbb H}$ with zero mean and covariance operator $\Sigma={\mathbb E}(X\otimes X),$ and let $\hat\Sigma$ …
Asymptotically efficient estimation of smooth functionals of covariance operators
Let $X$ be a centered Gaussian random variable in a separable Hilbert space ${\mathbb H}$ with covariance operator $\Sigma.$ We study a problem of estimation of a smooth functional of $\Sigma$ based…
New asymptotic results in principal component analysis
Let $X$ be a mean zero Gaussian random vector in a separable Hilbert space ${\mathbb H}$ with covariance operator $\Sigma:={\mathbb E}(X\otimes X).$ Let $\Sigma=\sum_{r\geq 1}\mu_r P_r$ be the spectral decomposition of $\Sigma$…
Estimation of smooth functionals in normal models: Bias reduction and asymptotic efficiency
TLDR
The crucial part of the construction of the estimator $f_k(\hat \theta)$ is a bias reduction method studied in the paper for statistical models more general than normal.
Efficient estimation of smooth functionals in Gaussian shift models
We study a problem of estimation of smooth functionals of parameter $\theta$ of the Gaussian shift model $$ X=\theta +\xi,\ \theta \in E, $$ where $E$ is a separable Banach space and $X$ is an…
Bootstrapping the Operator Norm in High Dimensions: Error Estimation for Covariance Matrices and Sketching
TLDR
The main result shows that the bootstrap can approximate the distribution of T_n at the dimension-free rate of $n^{-\frac{\beta-1/2}{6\beta+4}}$, with respect to the Kolmogorov metric.
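A minimal multiplier-bootstrap sketch of a statistic like the $T_n$ discussed in the TLDR above — here taken as $T_n=\sqrt{n}\,\|\hat\Sigma-\Sigma\|$. The resampling scheme, Gaussian multipliers, and all parameters are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, B = 20, 500, 200  # illustrative dimension, sample size, bootstrap draws

Sigma = np.diag(1.0 / np.arange(1, p + 1))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
Sigma_hat = X.T @ X / n

# Statistic of interest: T_n = sqrt(n) * ||Sigma_hat - Sigma||_op.
T_n = np.sqrt(n) * np.linalg.norm(Sigma_hat - Sigma, ord=2)

# Multiplier bootstrap: reweight the centered summands X_j X_j^T - Sigma_hat
# with i.i.d. standard Gaussian multipliers w_j and recompute the norm.
centered = np.einsum('ji,jk->jik', X, X) - Sigma_hat  # shape (n, p, p)
T_boot = np.empty(B)
for b in range(B):
    w = rng.standard_normal(n)
    T_boot[b] = np.sqrt(n) * np.linalg.norm(
        np.tensordot(w, centered, axes=1) / n, ord=2)

# Bootstrap estimate of the 90% quantile of the distribution of T_n.
q90 = np.quantile(T_boot, 0.9)
print(T_n, q90)
```

The bootstrap quantile `q90` would then serve as a data-driven critical value for $T_n$; the cited paper's contribution is proving that such approximations hold at a dimension-free rate.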
Bayesian inference for spectral projectors of the covariance matrix
Let $X_1, \ldots, X_n$ be an i.i.d. sample in $\mathbb{R}^p$ with zero mean and covariance matrix $\mathbf{\Sigma^*}$. The classical PCA approach recovers the projector $\mathbf{P^*_{\mathcal{J}}}$…
Bootstrap confidence sets for spectral projectors of sample covariance
Let $X_{1},\ldots,X_{n}$ be an i.i.d. sample in $\mathbb{R}^{p}$ with zero mean and covariance matrix $\Sigma$. The problem of recovering the projector onto an…
Efficient estimation of linear functionals of principal components
We study principal component analysis (PCA) for mean zero i.i.d. Gaussian observations $X_1,\dots, X_n$ in a separable Hilbert space $\mathbb{H}$ with unknown covariance operator $\Sigma.$ The…

References

Showing 1–10 of 18 references
Empirical Processes with a Bounded Ψ1 Diameter
TLDR
Several well-known results in Asymptotic Geometric Analysis are extended to any isotropic, log-concave ensemble on $\mathbb{R}^n$, and optimal bounds on the random diameters are presented.
A note on the Hanson-Wright inequality for random vectors with dependencies
We prove that quadratic forms in isotropic random vectors $X$ in $\mathbb{R}^n$, possessing the convex concentration property with constant $K$, satisfy the Hanson-Wright inequality with constant
User-Friendly Tail Bounds for Sums of Random Matrices
  • J. Tropp, Found. Comput. Math., 2012
TLDR
This paper presents new probability inequalities for sums of independent, random, self-adjoint matrices and provides noncommutative generalizations of the classical bounds associated with the names Azuma, Bennett, Bernstein, Chernoff, Hoeffding, and McDiarmid.
Oracle inequalities in empirical risk minimization and sparse recovery problems
The purpose of these lecture notes is to provide an introduction to the general theory of empirical risk minimization with an emphasis on excess risk bounds and oracle inequalities in penalized
Non commutative Khintchine and Paley inequalities
considered as a norm on the set of all finitely supported sequences $(x_n)$ in $X$. While a satisfactory solution seems hopeless at the moment for an arbitrary space $X$, there are cases for which the
Oracle inequalities and the isomorphic method
We use the isomorphic method to study exact and non-exact oracle inequalities for empirical risk minimization in classes of functions that satisfy a subgaussian condition. We show that the
Random Vectors in the Isotropic Position
Abstract. Let $y$ be a random vector in $\mathbb{R}^n$ satisfying $\mathbb{E}\,y\otimes y=\mathrm{id}$. Let $M$ be a natural number and let $y_1,\dots,y_M$ be independent copies of $y$. We study the question of approximation of the identity operator by…
Introduction to the non-asymptotic analysis of random matrices
TLDR
This is a tutorial on some basic non-asymptotic methods and concepts in random matrix theory, particularly for the problem of estimating covariance matrices in statistics and for validating probabilistic constructions of measurement matrices in compressed sensing.
Empirical processes and random projections
High-dimensional covariance matrix estimation with missing observations
TLDR
This paper establishes non-asymptotic sparsity oracle inequalities for the estimation of the covariance matrix with the Frobenius and spectral norms, valid for any setting of the sample size and the dimension of the observations.