Corpus ID: 252545206

Improved covariance estimation: optimal robustness and sub-Gaussian guarantees under heavy tails

@inproceedings{Oliveira2022ImprovedCE,
  title={Improved covariance estimation: optimal robustness and sub-Gaussian guarantees under heavy tails},
  author={Roberto I. Oliveira and Zoraida F. Rico},
  year={2022}
}
We present an estimator of the covariance matrix Σ of a random d-dimensional vector from an i.i.d. sample of size n. Our sole assumption is that this vector satisfies a bounded L_p − L_2 moment assumption over its one-dimensional marginals, for some p ≥ 4. Given this, we show that Σ can be estimated from the sample with the same high-probability error rates that the sample covariance matrix achieves in the case of Gaussian data. This holds even though we allow for very general distributions that…
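The abstract's setting — estimating Σ from heavy-tailed data with only a few finite moments — can be illustrated with a simple median-of-means baseline. The sketch below is an illustration only, not the paper's estimator; the function name `mom_covariance` and the block count `k` are choices made here for the example.

```python
import numpy as np

def mom_covariance(X, k=10, seed=None):
    """Median-of-means style covariance sketch (illustration only).

    Splits the n samples into k disjoint blocks, computes the sample
    covariance on each block, and takes the entrywise median across
    blocks. A simple robust baseline under weak moment assumptions;
    NOT the estimator proposed by Oliveira & Rico.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    idx = rng.permutation(n)                      # shuffle before blocking
    blocks = np.array_split(idx, k)               # k disjoint blocks
    covs = np.stack(
        [np.cov(X[b], rowvar=False, bias=True) for b in blocks]
    )
    return np.median(covs, axis=0)                # entrywise median

# Heavy-tailed example: Student-t with 5 dof has finite fourth moments,
# so a bounded L_4 - L_2 type condition is plausible for its marginals.
rng = np.random.default_rng(0)
X = rng.standard_t(df=5, size=(2000, 3))
Sigma_hat = mom_covariance(X, k=20, seed=1)
```

The entrywise median keeps each coordinate's estimate stable against a few wild blocks, at the cost of the output no longer being a block sample covariance (it remains symmetric, though positive semidefiniteness is not guaranteed).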
1 Citation

References

Showing 10 of 34 references

Covariance Estimation: Optimal Dimension-free Guarantees for Adversarial Corruption and Heavy Tails

A dimension-free Bai-Yin type theorem in the regime p > 4 is proved, and despite requiring the existence of only a few moments, the estimator achieves the same tail estimates as if the underlying distribution were Gaussian.

Robust Estimators in High Dimensions without the Computational Intractability

This work obtains the first computationally efficient algorithms for agnostically learning several fundamental classes of high-dimensional distributions: a single Gaussian, a product distribution on the hypercube, mixtures of two product distributions (under a natural balancedness condition), and k Gaussians with identical spherical covariances.

Robust and Heavy-Tailed Mean Estimation Made Simple, via Regret Minimization

By leveraging the duality theorem, the meta-problem can be solved either by a variant of the Filter algorithm from the recent literature on robust estimation, or by the quantum entropy scoring scheme (QUE) due to Dong, Hopkins and Li.

Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries

Estimation of the covariance matrix has attracted a lot of attention of the statistical research community over the years, partially due to important applications such as Principal Component…

Robust covariance estimation under $L_{4}-L_{2}$ norm equivalence

Let X be a centered random vector taking values in R^d and let Σ = E(X ⊗ X) be its covariance matrix. We show that if X satisfies an L_4 − L_2 norm equivalence (sometimes referred to as the…

Quantitative estimates of the convergence of the empirical covariance matrix in log-concave ensembles

Let K be an isotropic convex body in R^n. Given ε > 0, how many independent points X_i uniformly distributed on K are needed for the empirical covariance matrix to approximate the identity up to ε with…

Mean estimation with sub-Gaussian rates in polynomial time

This work offers the first polynomial time algorithm to estimate the mean with sub-Gaussian-size confidence intervals under such mild assumptions, based on a new semidefinite programming relaxation of a high-dimensional median.

The lower tail of random quadratic forms with applications to ordinary least squares

This paper proves that the “lower tail” of such a matrix is sub-Gaussian under a simple fourth moment assumption on the one-dimensional marginals of the random vectors, and obtains a nearly optimal finite-sample result for the ordinary least squares estimator under random design.

Robust linear least squares regression

A new estimator is provided, based on truncating differences of losses in a min-max framework, that satisfies a d/n risk bound both in expectation and in deviations; the main advantage is the absence of an exponential moment condition on the output distribution while still achieving exponential deviations.

Robust modifications of U-statistics and applications to covariance estimation problems

Let $Y$ be a $d$-dimensional random vector with unknown mean $\mu$ and covariance matrix $\Sigma$. This paper is motivated by the problem of designing an estimator of $\Sigma$ that admits tight…