Corpus ID: 231985902

Adversarial robust weighted Huber regression

Takeyuki Sasai and Hironori Fujisawa
We consider robust estimation of linear regression coefficients. In this note, we focus on the case where the covariates are sampled from an L-subGaussian distribution with unknown covariance, the noise is sampled from a distribution with a bounded absolute moment, and both the covariates and the noise may be contaminated by an adversary. We derive an estimation error bound, which depends on the stable rank and the condition number of the covariance matrix of the covariates, with a polynomial…
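As a minimal, hypothetical sketch of what Huber regression looks like in practice — plain gradient descent on the Huber loss, not the paper's weighted estimator with its adversarial-contamination guarantees — one could write:

```python
import numpy as np

def huber_grad(r, delta):
    """Elementwise gradient of the Huber loss w.r.t. the residual r:
    quadratic (identity) inside [-delta, delta], clipped to +/-delta outside."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def huber_regression(X, y, delta=1.0, lr=0.1, n_iter=500):
    """Unweighted Huber regression by gradient descent (illustrative only)."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        r = X @ beta - y
        beta -= lr * (X.T @ huber_grad(r, delta)) / n
    return beta

# Toy data: clean Gaussian covariates, a few gross outliers in the labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=200)
y[:10] += 50.0  # adversarial-style label contamination
beta_hat = huber_regression(X, y)
```

Because the loss gradient is clipped at `delta`, each corrupted label can pull the estimate only a bounded amount, whereas ordinary least squares would be dragged far off by the shifted labels.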



Outlier-robust estimation of a sparse linear model using ℓ1-penalized Huber's M-estimator

It is proved that the $\ell_1$-penalized Huber's M-estimator based on $n$ samples attains the optimal rate of convergence, up to a logarithmic factor, in the case where the labels are contaminated by adversarial outliers.
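A minimal sketch of an $\ell_1$-penalized Huber M-estimator, solved by proximal gradient descent (ISTA) — an illustrative toy, not the estimator or tuning analyzed in the paper:

```python
import numpy as np

def huber_grad(r, delta):
    """Elementwise gradient of the Huber loss w.r.t. the residual."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_huber(X, y, lam=0.1, delta=1.0, lr=0.1, n_iter=1000):
    """Proximal gradient on: mean Huber loss + lam * ||beta||_1."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        g = X.T @ huber_grad(X @ beta - y, delta) / n
        beta = soft_threshold(beta - lr * g, lr * lam)
    return beta

# Toy sparse problem with contaminated labels.
rng = np.random.default_rng(1)
n, d = 200, 20
X = rng.normal(size=(n, d))
beta_true = np.zeros(d)
beta_true[:3] = [2.0, -3.0, 1.5]
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[:10] += 50.0  # adversarial label outliers
beta_hat = l1_huber(X, y)
```

The soft-thresholding step drives the coordinates outside the true support to (near) zero, while the clipped Huber gradient keeps the corrupted labels from biasing the active coefficients.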

All-in-one robust estimator of the Gaussian mean

It is shown that a single robust estimator of the mean of a multivariate Gaussian distribution can enjoy five desirable properties and can be extended to sub-Gaussian distributions, as well as to the cases of unknown rate of contamination or unknown covariance matrix.

High-Dimensional Robust Mean Estimation in Nearly-Linear Time

This work gives the first nearly-linear-time algorithms for high-dimensional robust mean estimation, both for distributions with known covariance and sub-Gaussian tails and for distributions with unknown bounded covariance, and exploits the special structure of the corresponding SDPs to show that they are approximately solvable in nearly-linear time.

Efficient Algorithms and Lower Bounds for Robust Linear Regression

It is shown that any polynomial-time SQ learning algorithm for robust linear regression (in Huber's contamination model) with bounded estimation complexity must incur an error of $\Omega(\sqrt{\epsilon}\,\sigma)$.

Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries

Estimation of the covariance matrix has attracted a lot of attention from the statistical research community over the years, partly due to important applications such as Principal Component Analysis.

Optimal Robust Linear Regression in Nearly Linear Time

These estimators and their analysis leverage recent developments in the construction of faster algorithms for robust mean estimation to improve runtimes, and refined concentration of measure arguments alongside Gaussian rounding techniques to improve statistical sample complexities.

Robustly Learning a Gaussian: Getting Optimal Error, Efficiently

This work gives robust estimators that achieve estimation error $O(\varepsilon)$ in the total variation distance, which is optimal up to a universal constant that is independent of the dimension.

High-Dimensional Robust Mean Estimation via Gradient Descent

This work establishes an intriguing connection between algorithmic high-dimensional robust statistics and non-convex optimization, which may have broader applications to other robust estimation tasks.
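To illustrate the flavor of gradient-based robust mean estimation — using a much simpler convex surrogate (gradient descent on a Huber-smoothed geometric median) than the paper's non-convex filtering objective — a hypothetical sketch:

```python
import numpy as np

def huber_mean_gd(points, delta=1.0, lr=0.5, n_iter=300):
    """Gradient descent on sum_i Huber(||x_i - theta||): a smoothed geometric
    median. Illustrative only; far simpler than the paper's objective."""
    theta = np.median(points, axis=0)  # coordinatewise median as a warm start
    for _ in range(n_iter):
        diff = theta - points                    # (n, d)
        norms = np.linalg.norm(diff, axis=1)
        # Huber gradient: quadratic inside delta, clipped (unit pull) outside.
        scale = np.where(norms <= delta, 1.0,
                         delta / np.maximum(norms, 1e-12))
        grad = (scale[:, None] * diff).mean(axis=0)
        theta = theta - lr * grad
    return theta

# Toy data: 90% Gaussian inliers, 10% far-away outliers.
rng = np.random.default_rng(3)
mu = np.array([0.5, -2.0, 1.0, 0.0, 3.0])
points = np.vstack([mu + rng.normal(size=(450, 5)),
                    mu + 100.0 + rng.normal(size=(50, 5))])
theta_hat = huber_mean_gd(points)
```

Each outlier contributes a gradient of norm at most `delta`, so the 10% corrupted fraction can shift the fixed point only by a bounded amount.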

Faster Algorithms for High-Dimensional Robust Covariance Estimation

This work studies the problem of estimating the covariance matrix of a high-dimensional distribution when a small constant fraction of the samples can be arbitrarily corrupted, and develops faster algorithms for this problem whose running time nearly matches that of computing the empirical covariance.

Outlier Robust Mean Estimation with Subgaussian Rates via Stability

This work obtains the first computationally efficient algorithm with subgaussian rate for outlier-robust mean estimation in the strong contamination model under a finite covariance assumption.
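As a naive baseline for outlier-robust mean estimation in the strong contamination model — a simple trimming rule, not the stability-based filter with subgaussian rates from the paper — one could sketch:

```python
import numpy as np

def trimmed_robust_mean(points, eps):
    """Drop the points farthest from the coordinatewise median, then average
    the rest. A crude baseline: it removes a 2*eps fraction so that an
    eps fraction of adversarial points is almost surely among the dropped."""
    med = np.median(points, axis=0)
    dist = np.linalg.norm(points - med, axis=1)
    n_keep = int(np.ceil((1.0 - 2.0 * eps) * len(points)))
    keep = np.argsort(dist)[:n_keep]
    return points[keep].mean(axis=0)

# Toy data: 90% Gaussian inliers, 10% gross outliers far from the mean.
rng = np.random.default_rng(2)
mu = np.array([1.0, -1.0, 0.5, 2.0, 0.0])
points = np.vstack([mu + rng.normal(size=(450, 5)),
                    mu + 100.0 + rng.normal(size=(50, 5))])
naive = points.mean(axis=0)
robust = trimmed_robust_mean(points, eps=0.1)
```

The empirical mean is dragged far toward the outlier cluster, while the trimmed estimate stays near the true mean; the papers above achieve much sharper (subgaussian) error rates with more careful, iterative filtering.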