Corpus ID: 248965435

Estimation of smooth functionals of covariance operators: jackknife bias reduction and bounds in terms of effective rank

@inproceedings{Koltchinskii2022EstimationOS,
  title={Estimation of smooth functionals of covariance operators: jackknife bias reduction and bounds in terms of effective rank},
  author={Vladimir Koltchinskii},
  year={2022}
}
Abstract: Let $E$ be a separable Banach space and let $X, X_1,\dots,X_n,\dots$ be i.i.d. Gaussian random variables taking values in $E$ with mean zero and unknown covariance operator $\Sigma: E^{\ast}\mapsto E$. The complexity of estimation of $\Sigma$ based on observations $X_1,\dots,X_n$ is naturally characterized by the so-called effective rank of $\Sigma$: ${\bf r}(\Sigma) := \frac{{\mathbb E}_{\Sigma}\|X\|^2}{\|\Sigma\|},$ where $\|\Sigma\|$ is the operator norm of $\Sigma$. Given a smooth real-valued functional $f$ defined on the space $L(E^{\ast},E)$ of symmetric linear operators from $E^{\ast}$ into …
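To make the effective rank and the bias-reduction idea concrete, here is a minimal finite-dimensional numerical sketch, not the paper's estimator: since ${\mathbb E}_{\Sigma}\|X\|^2 = \mathrm{tr}(\Sigma)$ for a mean-zero Gaussian $X$, the effective rank reduces to $\mathrm{tr}(\Sigma)/\|\Sigma\|$, and the classical delete-one jackknife is applied to a plug-in estimator $f(\hat\Sigma)$ for an illustrative functional $f(\Sigma)=\log\det\Sigma$. All function names and the choice of $f$ are assumptions made for the example; the paper's higher-order jackknife construction and Banach-space setting are not reproduced here.

```python
# Sketch only: finite-dimensional effective rank and a classical (first-order)
# jackknife bias correction of a plug-in estimator f(Sigma_hat).
import numpy as np

def effective_rank(sigma: np.ndarray) -> float:
    """r(Sigma) = tr(Sigma) / ||Sigma||_op, using E||X||^2 = tr(Sigma) for mean-zero X."""
    return float(np.trace(sigma) / np.linalg.norm(sigma, ord=2))

def plug_in(f, sample: np.ndarray) -> float:
    """Plug-in estimator f(Sigma_hat), with Sigma_hat the sample covariance (mean known to be 0)."""
    n = sample.shape[0]
    sigma_hat = sample.T @ sample / n
    return f(sigma_hat)

def jackknife_bias_corrected(f, sample: np.ndarray) -> float:
    """Classical delete-one jackknife correction of the plug-in estimator.
    This removes only the leading O(1/n) bias term; the paper's higher-order
    construction is not reproduced here."""
    n = sample.shape[0]
    full = plug_in(f, sample)
    loo = np.array([plug_in(f, np.delete(sample, i, axis=0)) for i in range(n)])
    return n * full - (n - 1) * loo.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n = 20, 500
    sigma = np.diag(1.0 / np.arange(1, d + 1))      # illustrative covariance
    X = rng.multivariate_normal(np.zeros(d), sigma, size=n)
    f = lambda s: np.log(np.linalg.det(s))          # illustrative smooth functional
    print("effective rank r(Sigma):", effective_rank(sigma))
    print("plug-in f(Sigma_hat):   ", plug_in(f, X))
    print("jackknife-corrected:    ", jackknife_bias_corrected(f, X))
    print("true f(Sigma):          ", f(sigma))
```

The correction $n\,f(\hat\Sigma) - (n-1)\,\overline{f(\hat\Sigma^{(-i)})}$ is the standard first-order jackknife, which cancels the leading $O(1/n)$ bias of the plug-in estimator; it is shown here only as the simplest instance of the bias-reduction idea named in the title.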

References

Showing 1–10 of 32 references
Estimation of Smooth Functionals of Location Parameter in Gaussian and Poincaré Random Shift Models
Let $E$ be a separable Banach space and let $f: E\mapsto {\mathbb R}$ be a smooth functional. We discuss a problem of estimation of $f(\theta)$ based on an observation $X=\theta+\xi,$ where $\theta\in E$ is an …
Estimation of Integral Functionals of a Density
Let $\varphi$ be a smooth function of $k+2$ variables. We shall investigate in this paper the rates of convergence of estimators of $T(f)=\int \varphi\bigl(f(x), f'(x),\dots,f^{(k)}(x), x\bigr)\,dx$ when $f$ belongs to some class of …
Functional estimation in log-concave location families
Let $\{P_\theta : \theta\in{\mathbb R}\}$ be a log-concave location family with $P_\theta(dx)=e^{-V(x-\theta)}\,dx,$ where $V:{\mathbb R}\mapsto{\mathbb R}$ is a known convex function, and let $X_1,\dots,X_n$ be i.i.d. r.v. sampled from distribution $P_\theta$ …
Statistical estimation: asymptotic theory
…when certain parameters in the problem tend to limiting values (for example, when the sample size increases indefinitely, the intensity of the noise approaches zero, etc.). To address the problem of …
Efficient estimation of linear functionals of principal components
We study principal component analysis (PCA) for mean zero i.i.d. Gaussian observations $X_1,\dots, X_n$ in a separable Hilbert space ${\mathbb H}$ with unknown covariance operator $\Sigma.$ …
Normal approximation and concentration of spectral projectors of sample covariance
Let $X,X_1,\dots, X_n$ be i.i.d. Gaussian random variables in a separable Hilbert space ${\mathbb H}$ with zero mean and covariance operator $\Sigma={\mathbb E}(X\otimes X),$ and let …
Estimation of smooth functionals in high-dimensional models: bootstrap chains and Gaussian approximation.
A functional is constructed that is an asymptotically normal estimator of $f(\theta),$ with general upper bounds on Orlicz norm error rates for estimator $g$ depending on smoothness, dimension $d,$ sample size $n,$ and the accuracy of normal approximation of $\sqrt{n}(\hat\theta_n-\theta).$
Higher order influence functions and minimax estimation of nonlinear functionals
We present a theory of point and interval estimation for nonlinear functionals in parametric, semi-, and non-parametric models based on higher order influence functions (Robins (2004), Section 9; Li …
Asymptotics and Concentration Bounds for Bilinear Forms of Spectral Projectors of Sample Covariance
Let $X,X_1,\dots, X_n$ be i.i.d. Gaussian random variables with zero mean and covariance operator $\Sigma={\mathbb E}(X\otimes X)$ taking values in a separable Hilbert space ${\mathbb H}.$ Let …
Efficient estimation of smooth functionals in Gaussian shift models
We study a problem of estimation of smooth functionals of parameter $\theta$ of the Gaussian shift model $X=\theta+\xi,\ \theta\in E,$ where $E$ is a separable Banach space and $X$ is an …