# Estimating Principal Components under Adversarial Perturbations

```bibtex
@inproceedings{Awasthi2020EstimatingPC,
  title     = {Estimating Principal Components under Adversarial Perturbations},
  author    = {Pranjal Awasthi and Xue Chen and Aravindan Vijayaraghavan},
  booktitle = {COLT},
  year      = {2020}
}
```

Robustness is a key requirement for widespread deployment of machine learning algorithms, and has received much attention in both statistics and computer science. We study a natural model of robustness for high-dimensional statistical estimation problems that we call the adversarial perturbation model. An adversary can perturb every sample arbitrarily up to a specified magnitude $\delta$ measured in some $\ell_q$ norm, say $\ell_\infty$. Our model is motivated by emerging paradigms such as low…
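As a minimal illustration of the perturbation model described above (the spiked-data setup, constants, and worst-case-signed noise here are our own assumptions, not taken from the paper), the sketch below perturbs every sample by at most $\delta$ in $\ell_\infty$ norm and checks how the top principal component moves:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n samples in R^d, each perturbed by at most
# delta per coordinate (an l_inf-bounded adversarial perturbation).
n, d, delta = 500, 20, 0.05

# Clean data from a single-spike model along direction v.
v = np.zeros(d)
v[0] = 1.0
X = rng.normal(size=(n, 1)) * v + 0.1 * rng.normal(size=(n, d))

# The adversary may choose any E with max_ij |E_ij| <= delta;
# random-sign noise at full magnitude is used here as a stand-in.
E = delta * np.sign(rng.normal(size=(n, d)))
X_adv = X + E

def top_component(M):
    # Top principal component via SVD of the centered data matrix.
    Mc = M - M.mean(axis=0)
    _, _, Vt = np.linalg.svd(Mc, full_matrices=False)
    return Vt[0]

u_clean = top_component(X)
u_adv = top_component(X_adv)

# For a small delta relative to the signal, both estimates stay
# well aligned with the true direction v (up to sign).
print(abs(u_clean @ v), abs(u_adv @ v))
```

With a larger $\delta$ (or structured rather than random perturbations), the alignment of `u_adv` with `v` degrades, which is the regime the paper's estimators are designed for.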

## 2 Citations

### Understanding Simultaneous Train and Test Robustness

- Computer Science, ALT
- 2022

This work shows that the two seemingly different notions of robustness at train time and test time are closely related, and that this connection can be leveraged to develop algorithmic techniques applicable in both settings.

### Adversarially robust subspace learning in the spiked covariance model

- Computer Science, Stat. Anal. Data Min.
- 2022

This work derives the adversarial projection risk when the data follow a multivariate Gaussian distribution with spiked covariance (the so-called spiked covariance model), and establishes an upper bound on the empirical risk for finding the robust subspace in the general spiked covariance model.

## References

Showing 1-10 of 73 references.

### Squared-Norm Empirical Process in Banach Space

- Mathematics
- 2013

This note extends a recent result of Mendelson on the supremum of a quadratic process to squared norms of functions taking values in a Banach space. Our method of proof is a reduction by a…

### Adversarially Robust Low Dimensional Representations

- Computer Science, COLT
- 2021

This work formulates a natural extension of Principal Component Analysis (PCA) in which the goal is to find a low-dimensional subspace that represents the given data with minimum projection error and is, in addition, robust to small perturbations measured in $\ell_q$ norm.
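One plausible formalization of this robust-PCA objective (the notation below is ours, not taken from the entry): given samples $x_1,\dots,x_n \in \mathbb{R}^d$, a target dimension $k$, and a perturbation budget $\delta$ in $\ell_q$ norm, seek a rank-$k$ orthogonal projection $\Pi$ minimizing the worst-case projection error

$$\min_{\Pi :\, \mathrm{rank}(\Pi)=k}\ \frac{1}{n}\sum_{i=1}^{n}\ \max_{\|e_i\|_q \le \delta}\ \big\|(x_i+e_i)-\Pi(x_i+e_i)\big\|_2^2 .$$

Setting $\delta = 0$ recovers standard PCA; the inner maximization is what makes the adversarially robust variant harder.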

### On Robustness to Adversarial Examples and Polynomial Optimization

- Computer Science, Mathematics, NeurIPS
- 2019

The main contribution of this work is to exhibit a strong connection between achieving robustness to adversarial examples and a rich class of polynomial optimization problems, thereby making progress on the complexity of robust learning.

### High Dimensional Probability

- Mathematics
- 2006

About forty years ago it was realized by several researchers that the essential features of certain objects of Probability theory, notably Gaussian processes and limit theorems, may be better…

### Sever: A Robust Meta-Algorithm for Stochastic Optimization

- Computer Science, ICML
- 2019

This work introduces a new meta-algorithm that can take a base learner such as least squares or stochastic gradient descent and harden it to be resistant to outliers; in both cases, the hardened learner exhibits substantially greater robustness than several baselines.

### Learning geometric concepts with nasty noise

- Computer Science, STOC
- 2018

This work gives the first polynomial-time PAC learning algorithms for low-degree PTFs and intersections of halfspaces with dimension-independent error guarantees in the presence of nasty noise under the Gaussian distribution.

### Robustly Learning a Gaussian: Getting Optimal Error, Efficiently

- Computer Science, Mathematics, SODA
- 2018

This work gives robust estimators that achieve estimation error $O(\varepsilon)$ in the total variation distance, which is optimal up to a universal constant that is independent of the dimension.

### Tighten after Relax: Minimax-Optimal Sparse PCA in Polynomial Time

- Computer Science, NIPS
- 2014

This paper proposes a two-stage sparse PCA procedure that attains the optimal principal subspace estimator in polynomial time and motivates a general paradigm of tackling nonconvex statistical learning problems with provable statistical guarantees.

### Complexity Theoretic Lower Bounds for Sparse Principal Component Detection

- Computer Science, Mathematics, COLT
- 2013

The performance of a test is measured by the smallest signal strength that it can detect. A computationally efficient method based on semidefinite programming is proposed, and it is proved that the statistical performance of this test cannot be strictly improved by any computationally efficient method.

### Coloring Random and Semi-Random k-Colorable Graphs

- Mathematics, Computer Science, J. Algorithms
- 1995

Algorithms are presented that color randomly generated k-colorable graphs at much lower edge densities than previous approaches, and it is shown that even for quite low noise rates, semi-random k-colorable graphs can be optimally colored with high probability.