# Compressed Sensing with Adversarial Sparse Noise via L1 Regression

```bibtex
@inproceedings{Karmalkar2018CompressedSW,
  title={Compressed Sensing with Adversarial Sparse Noise via L1 Regression},
  author={Sushrut Karmalkar and Eric Price},
  booktitle={SIAM Symposium on Simplicity in Algorithms},
  year={2018}
}
```
• Published in SIAM Symposium on Simplicity in Algorithms, 1 September 2018
• Computer Science
We present a simple and effective algorithm for the problem of \emph{sparse robust linear regression}. In this problem, one would like to estimate a sparse vector $w^* \in \mathbb{R}^n$ from linear measurements corrupted by sparse noise that can arbitrarily change an adversarially chosen $\eta$ fraction of measured responses $y$, as well as introduce bounded norm noise to the responses. For Gaussian measurements, we show that a simple algorithm based on L1 regression can successfully estimate…
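The decoder the abstract describes, $\hat{w} = \arg\min_w \|y - Xw\|_1$, is easy to prototype. The sketch below is illustrative rather than the authors' code: it solves the $\ell_1$ objective by iteratively reweighted least squares, and the corruption pattern and all parameters are invented for the demo.

```python
import numpy as np

def l1_regression(X, y, iters=200, eps=1e-8):
    """Approximate argmin_w ||y - X w||_1 via iteratively reweighted
    least squares: each step solves a weighted least-squares problem
    with weights 1 / max(|residual|, eps)."""
    w = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS warm start
    for _ in range(iters):
        r = y - X @ w
        d = 1.0 / np.maximum(np.abs(r), eps)      # IRLS weights
        # Solve the weighted normal equations (X^T D X) w = X^T D y.
        w = np.linalg.solve(X.T @ (X * d[:, None]), X.T @ (d * y))
    return w

rng = np.random.default_rng(0)
m, n = 100, 5
X = rng.standard_normal((m, n))                   # Gaussian measurements
w_star = rng.standard_normal(n)
y = X @ w_star
y[:10] += 50.0        # corrupt an eta = 0.1 fraction of responses
w_hat = l1_regression(X, y)
```

On this noiseless example the $\ell_1$ fit recovers $w^*$ despite the corruptions, whereas ordinary least squares is pulled far off by them.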
30 Citations

## Citations

### Low rank matrix recovery with adversarial sparse noise

• Computer Science
Inverse Problems
• 2021
It is shown that the nuclear-norm constrained least absolute deviation (LAD) can successfully estimate the ground-truth matrix for any ω < 0.239, and robust recovery results are established for an iterative hard thresholding algorithm applied to the rank-constrained LAD considering geometrically decaying step-sizes.

### Reconstruction under outliers for Fourier-sparse functions

• Computer Science, Mathematics
SODA
• 2020
Over the torus, assuming that the Fourier transform satisfies a certain \emph{granularity} condition, there is a sample efficient algorithm to tolerate $\rho =\Omega(1)$ fraction of outliers and further, that this is not possible without such a granularity condition.

### Online Robust Regression via SGD on the l1 loss

• Computer Science, Mathematics
NeurIPS
• 2020
It is shown in this work that stochastic gradient descent on the $\ell_1$ loss converges to the true parameter vector at a $\tilde{O}( 1 / (1 - \eta)^2 n )$ rate which is independent of the values of the contaminated measurements.
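As a rough illustration of that scheme (the step-size schedule, tail averaging, and all parameters below are my choices for a demo, not the paper's):

```python
import numpy as np

def sgd_l1(X, y, steps=50_000, c=0.5, seed=1):
    """Stochastic subgradient descent on f_i(w) = |y_i - <x_i, w>|,
    with step size c / sqrt(t+1) and averaging over the last half
    of the iterates."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    w = np.zeros(n)
    w_sum = np.zeros(n)
    half = steps // 2
    for t in range(steps):
        i = rng.integers(m)
        r = y[i] - X[i] @ w
        # subgradient of |r| w.r.t. w is -sign(r) * x_i
        w += (c / np.sqrt(t + 1)) * np.sign(r) * X[i]
        if t >= half:
            w_sum += w
    return w_sum / (steps - half)

rng = np.random.default_rng(0)
m, n = 2000, 5
X = rng.standard_normal((m, n))
w_star = rng.standard_normal(n)
y = X @ w_star
idx = rng.choice(m, size=m // 10, replace=False)        # eta = 0.1
y[idx] += rng.choice([-100.0, 100.0], size=idx.size)    # gross outliers
w_hat = sgd_l1(X, y)
```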

### Outlier-robust sparse/low-rank least-squares regression and robust matrix completion

We study high-dimensional least-squares regression within a subgaussian statistical learning framework with heterogeneous noise. It includes $s$-sparse and $r$-low-rank least-squares regression when…

### Outlier-robust estimation of a sparse linear model using $\ell_1$-penalized Huber's M-estimator

• Computer Science, Mathematics
NeurIPS
• 2019
It is proved that the $\ell_1$-penalized Huber's M-estimator based on $n$ samples attains the optimal rate of convergence, up to a logarithmic factor, in the case where the labels are contaminated by at most adversarial outliers.
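A minimal proximal-gradient (ISTA) sketch of an $\ell_1$-penalized Huber $M$-estimator; the penalty level, Huber threshold, and step size below are illustrative defaults, not the paper's tuned choices:

```python
import numpy as np

def soft_threshold(v, t):
    """Prox operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_huber(X, y, lam=0.05, delta=1.0, step=0.5, iters=1000):
    """ISTA for min_w (1/n) sum_i huber_delta(y_i - <x_i, w>) + lam*||w||_1.
    The Huber score psi(r) = clip(r, -delta, delta) caps the influence
    of any single (possibly corrupted) response."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        r = y - X @ w
        grad = -(X.T @ np.clip(r, -delta, delta)) / n
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(0)
n, d = 500, 20
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[[2, 7, 15]] = [3.0, -2.0, 2.5]                   # sparse ground truth
y = X @ w_star
y[rng.choice(n, size=50, replace=False)] += 30.0        # label outliers
w_hat = l1_huber(X, y)
```

The soft-thresholding step handles the $\ell_1$ penalty exactly, while the clipped score keeps the 10% of contaminated labels from dominating the gradient.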

### Regress Consistently when Oblivious Outliers Overwhelm

• Computer Science, Mathematics
ArXiv
• 2020
In the special case of Gaussian design, it is shown that a strikingly simple algorithm based on computing coordinate-wise medians achieves similar guarantees in linear time and extends to the settings where the parameter vector $\beta^*$ is sparse.
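A toy version of the median idea for standard Gaussian design: for each coordinate $j$, the ratio $y_i/x_{ij} = \beta_j + (\sum_{k \neq j} \beta_k x_{ik} + e_i)/x_{ij}$, and the non-signal term is symmetric about zero, so the coordinate-wise median of the ratios estimates $\beta_j$. This only illustrates the principle (the paper's estimator and analysis are more careful); the corruption model below is a symmetric oblivious one chosen for the demo.

```python
import numpy as np

def median_ratio_regression(X, y):
    """Coordinate-wise medians of the ratios y_i / x_ij: for standard
    Gaussian design the non-signal part of each ratio is symmetric
    about zero, so the median concentrates around beta_j."""
    return np.median(y[:, None] / X, axis=0)

rng = np.random.default_rng(0)
m, d = 20_001, 3
X = rng.standard_normal((m, d))
beta = np.array([1.0, -2.0, 3.0])
y = X @ beta
idx = rng.choice(m, size=m // 3, replace=False)          # ~33% corrupted
y[idx] = rng.choice([-1000.0, 1000.0], size=idx.size)    # oblivious outliers
beta_hat = median_ratio_regression(X, y)
```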

### List-Decodable Linear Regression

• Computer Science
NeurIPS
• 2019
This work gives the first polynomial-time algorithm for robust regression in the list-decodable setting where an adversary can corrupt a greater than $1/2$ fraction of examples and proves that the anti-concentration assumption on the inliers is information-theoretically necessary.

### Recovery guarantees for polynomial approximation from dependent data with outliers

• Mathematics, Computer Science
ArXiv
• 2018
The main contribution of this paper is to provide a reconstruction guarantee for the associated $\ell_1$-optimization problem where the sampling matrix is formed from dependent data, and proves that the sampling matrix satisfies the null space property and the stable null space property.

## References

Showing 1–10 of 20 references.

### Efficient Algorithms and Lower Bounds for Robust Linear Regression

• Computer Science, Mathematics
SODA
• 2019
Any polynomial-time SQ learning algorithm for robust linear regression (in Huber's contamination model) with estimation complexity must incur an error of $\Omega(\sqrt{\epsilon} \sigma)$.

### Robust Regression via Hard Thresholding

• Computer Science
NIPS
• 2015
A simple hard-thresholding algorithm called TORRENT is studied which, under mild conditions on X, can recover w* exactly even if b corrupts the response variables in an adversarial manner, i.e. both the support and entries of b are selected adversarially after observing X and w*.
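A compact sketch of the fully-corrective hard-thresholding idea: alternate a least-squares fit on the currently trusted points with re-selecting the $(1-\eta)$ fraction of points the model fits best. This is an illustrative reimplementation, not the paper's reference code.

```python
import numpy as np

def torrent_fc(X, y, eta, iters=20):
    """TORRENT-style alternation: (1) least squares on the active set,
    (2) hard-threshold the residuals, keeping the (1 - eta) fraction
    of points with smallest residuals as the new active set."""
    m, n = X.shape
    keep = int((1 - eta) * m)
    active = np.arange(m)        # start by trusting every point
    w = np.zeros(n)
    for _ in range(iters):
        w = np.linalg.lstsq(X[active], y[active], rcond=None)[0]
        resid = np.abs(y - X @ w)
        active = np.argsort(resid)[:keep]   # hard threshold on residuals
    return w

rng = np.random.default_rng(0)
m, n = 200, 5
X = rng.standard_normal((m, n))
w_star = rng.standard_normal(n)
y = X @ w_star
bad = rng.choice(m, size=20, replace=False)
y[bad] += 50.0                  # adversarial sparse corruption, eta = 0.1
w_hat = torrent_fc(X, y, eta=0.1)
```

With gross corruptions and noiseless inliers, the corrupted points acquire large residuals after the first fit, fall out of the active set, and the next least-squares solve recovers $w^*$ exactly.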

### Compressed Sensing and Matrix Completion with Constant Proportion of Corruptions

It is proved that one can recover an $n \times n$ low-rank matrix from $m$ corrupted sampled entries by tractable optimization provided the rank is on the order of $O(m/(n \log^2 n))$; again, this holds when there is a positive fraction of corrupted samples.

### Corrupted Sensing: Novel Guarantees for Separating Structured Signals

• Computer Science
IEEE Transactions on Information Theory
• 2014
This work analyzes both penalized programs that trade off between signal and corruption complexity, and constrained programs that bound the complexity of signal or corruption when prior information is available, and provides new interpretable bounds for the Gaussian complexity of sparse vectors, block-sparse vectors, and low-rank matrices.

### Exact signal recovery from sparsely corrupted measurements through the Pursuit of Justice

• Computer Science
2009 Conference Record of the Forty-Third Asilomar Conference on Signals, Systems and Computers
• 2009
It is demonstrated that a simple algorithm, which is dubbed Justice Pursuit (JP), can achieve exact recovery from measurements corrupted with sparse noise.

### High Dimensional Robust Sparse Regression

• Computer Science
AISTATS
• 2020
A filtering algorithm built around a novel randomized outlier-removal technique for robust sparse mean estimation, which may be of interest in its own right; the filtering algorithm is flexible enough to deal with unknown covariance.

### The price of privacy and the limits of LP decoding

• Computer Science
STOC '07
• 2007
The principal result is the discovery of a sharp threshold $\rho^* \approx 0.239$, which says that any privacy mechanism, interactive or non-interactive, providing reasonably accurate answers to a 0.761 fraction of randomly generated weighted subset-sum queries is blatantly non-private.

### Exact Recoverability From Dense Corrupted Observations via $\ell _{1}$-Minimization

• Computer Science
IEEE Transactions on Information Theory
• 2013
It is confirmed that stable recovery is possible when measurements are polluted by both gross sparse and small dense errors, and shown that with high probability, $\ell_1$-minimization can recover the sparse signal of interest.

### Consistent Robust Regression

• Computer Science, Mathematics
NIPS
• 2017
It is shown that CRR not only offers consistent estimates, but is empirically far superior to several other recently proposed algorithms for the robust regression problem, including extended Lasso and the TORRENT algorithm.

### Sever: A Robust Meta-Algorithm for Stochastic Optimization

• Computer Science
ICML
• 2019
This work introduces a new meta-algorithm that can take in a base learner such as least squares or stochastic gradient descent, and harden the learner to be resistant to outliers, and finds that in both cases it has substantially greater robustness than several baselines.