# Adversarial robust weighted Huber regression

@inproceedings{Sasai2021AdversarialRW,
  title={Adversarial robust weighted Huber regression},
  author={Takeyuki Sasai and Hironori Fujisawa},
  year={2021}
}
• Published 22 February 2021
• Computer Science, Mathematics
We consider robust estimation of linear regression coefficients. In this note, we focus on the case where the covariates are sampled from an $L$-subGaussian distribution with unknown covariance, the noise is sampled from a distribution with a bounded absolute moment, and both the covariates and the noise may be contaminated by an adversary. We derive an estimation error bound that depends on the stable rank and the condition number of the covariance matrix of the covariates, with a polynomial…
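For concreteness, the display below sketches the contamination setting the abstract describes and recalls the two covariance quantities the bound depends on, together with the Huber loss. The symbols ($\Sigma$, $\beta^*$, $\varepsilon$, $\delta$) are our own illustrative notation and need not match the paper's.

```latex
% Hedged sketch of the setting; notation assumed, not taken from the paper.
% Clean model: y_i = x_i^T beta* + xi_i, with x_i drawn L-subGaussian with
% covariance Sigma, and an adversary allowed to corrupt an eps-fraction of
% the observed pairs (x_i, y_i).
\[
  y_i = x_i^\top \beta^* + \xi_i, \qquad i = 1,\dots,n,
\]
\[
  \text{stable rank: } r(\Sigma) = \frac{\operatorname{tr}(\Sigma)}{\lVert \Sigma \rVert_{\mathrm{op}}},
  \qquad
  \text{condition number: } \kappa(\Sigma) = \frac{\lambda_{\max}(\Sigma)}{\lambda_{\min}(\Sigma)},
\]
\[
  \text{Huber loss: } h_\delta(t) =
  \begin{cases}
    t^2/2, & |t| \le \delta,\\[2pt]
    \delta\,|t| - \delta^2/2, & |t| > \delta.
  \end{cases}
\]
```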

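As a rough, hypothetical illustration of the "weighted Huber regression" idea in the title, the sketch below minimizes a per-sample-weighted Huber loss with SciPy. The function `weighted_huber_fit` and its weight argument are our own illustrative names; the weights are simply taken as input here, whereas a robust procedure such as the paper's would choose them to counter contaminated covariates. This is a generic sketch, not the authors' estimator.

```python
import numpy as np
from scipy.optimize import minimize


def huber(r, delta):
    """Elementwise Huber loss: quadratic for |r| <= delta, linear beyond."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))


def weighted_huber_fit(X, y, w, delta=1.345):
    """Minimize sum_i w_i * huber(y_i - x_i @ beta, delta) over beta.

    The per-sample weights w are assumed given; a robust procedure would
    choose them to downweight samples whose covariates look corrupted.
    """
    n, d = X.shape

    def objective(beta):
        r = y - X @ beta
        return float(np.sum(w * huber(r, delta)))

    def gradient(beta):
        r = y - X @ beta
        psi = np.clip(r, -delta, delta)  # derivative of the Huber loss
        return -X.T @ (w * psi)

    res = minimize(objective, x0=np.zeros(d), jac=gradient, method="L-BFGS-B")
    return res.x


# Toy usage: clean linear data plus a few gross outliers in the responses.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
beta_true = np.arange(1, d + 1, dtype=float)
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[:10] += 50.0                 # adversarial-looking response outliers
w = np.ones(n)                 # uniform weights for the demo
print(np.round(weighted_huber_fit(X, y, w, delta=1.0), 2))
```

With uniform weights this reduces to ordinary (unweighted) Huber regression, which already resists the response outliers injected in the toy example; the weighting only matters once the covariates themselves are contaminated.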
## References

Showing 1–10 of 42 references.

• NeurIPS, 2019 (Computer Science, Mathematics): It is proved that the $\ell_1$-penalized Huber's M-estimator based on $n$ samples attains the optimal rate of convergence, up to a logarithmic factor, when the labels are contaminated by a limited number of adversarial outliers.
• The Annals of Statistics, 2022 (Computer Science, Mathematics): It is shown that a single robust estimator of the mean of a multivariate Gaussian distribution can enjoy five desirable properties and can be extended to sub-Gaussian distributions, as well as to the cases of an unknown rate of contamination or an unknown covariance matrix.
• SODA, 2019 (Computer Science, Mathematics): This work gives the first nearly-linear-time algorithms for high-dimensional robust mean estimation, both for distributions with known covariance and sub-Gaussian tails and for distributions with unknown bounded covariance, and exploits the special structure of the corresponding SDPs to show that they are approximately solvable in nearly-linear time.
• SODA, 2019 (Computer Science, Mathematics): Any polynomial-time SQ learning algorithm for robust linear regression (in Huber's contamination model) with bounded estimation complexity must incur an error of $\Omega(\sqrt{\epsilon}\,\sigma)$.
• Estimation of the covariance matrix has attracted a lot of attention from the statistical research community over the years, partially due to important applications such as Principal Component Analysis.
• arXiv, 2020 (Computer Science, Mathematics): These estimators and their analysis leverage recent developments in the construction of faster algorithms for robust mean estimation to improve runtimes, and refined concentration-of-measure arguments alongside Gaussian rounding techniques to improve statistical sample complexities.
• SODA, 2018 (Computer Science, Mathematics): This work gives robust estimators that achieve estimation error $O(\varepsilon)$ in the total variation distance, which is optimal up to a universal constant that is independent of the dimension.
• ICML, 2020 (Computer Science): This work establishes an intriguing connection between algorithmic high-dimensional robust statistics and non-convex optimization, which may have broader applications to other robust estimation tasks.
• COLT, 2019 (Computer Science): This work studies the problem of estimating the covariance matrix of a high-dimensional distribution when a small constant fraction of the samples can be arbitrarily corrupted, and develops faster algorithms for this problem whose running time nearly matches that of computing the empirical covariance.
• NeurIPS, 2020 (Computer Science, Mathematics): This work obtains the first computationally efficient algorithm with a sub-Gaussian rate for outlier-robust mean estimation in the strong contamination model under a finite covariance assumption.