Compressed Sensing with Adversarial Sparse Noise via L1 Regression

@inproceedings{Karmalkar2018CompressedSW,
  title={Compressed Sensing with Adversarial Sparse Noise via L1 Regression},
  author={Sushrut Karmalkar and Eric Price},
  booktitle={SIAM Symposium on Simplicity in Algorithms},
  year={2018}
}
We present a simple and effective algorithm for the problem of \emph{sparse robust linear regression}. In this problem, one would like to estimate a sparse vector $w^* \in \mathbb{R}^n$ from linear measurements corrupted by sparse noise that can arbitrarily change an adversarially chosen $\eta$ fraction of measured responses $y$, as well as introduce bounded norm noise to the responses. For Gaussian measurements, we show that a simple algorithm based on L1 regression can successfully estimate… 
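
For concreteness, the following is a minimal sketch of the kind of $\ell_1$-regression estimator the abstract refers to, written here with cvxpy; the problem sizes, the $\ell_1$ radius, and the synthetic corruption model are illustrative assumptions rather than parameters from the paper.

```python
# Minimal sketch (not the authors' code): L1 regression over an l1 ball,
#   minimize ||X w - y||_1  subject to  ||w||_1 <= R,
# which promotes sparsity in w and tolerates a fraction of corrupted responses.
# All sizes, the radius R, and the corruption model below are illustrative assumptions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k, eta = 200, 100, 5, 0.1          # ambient dimension, measurements, sparsity, corruption rate

w_star = np.zeros(n)
w_star[rng.choice(n, k, replace=False)] = rng.normal(size=k)

X = rng.normal(size=(m, n))              # Gaussian measurement matrix
y = X @ w_star
bad = rng.choice(m, int(eta * m), replace=False)
y[bad] += rng.normal(scale=10.0, size=bad.size)   # sparse, large corruptions of the responses

w = cp.Variable(n)
R = np.sum(np.abs(w_star))               # assumes a known l1 radius; in practice a tuning parameter
problem = cp.Problem(cp.Minimize(cp.norm(X @ w - y, 1)), [cp.norm(w, 1) <= R])
problem.solve()

print("recovery error:", np.linalg.norm(w.value - w_star))
```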

Low rank matrix recovery with adversarial sparse noise

It is shown that the nuclear-norm constrained least absolute deviation (LAD) can successfully estimate the ground-truth matrix for any $\omega < 0.239$, and robust recovery results are established for an iterative hard-thresholding algorithm applied to the rank-constrained LAD with geometrically decaying step sizes.
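
For intuition, here is a minimal cvxpy sketch of the nuclear-norm constrained LAD objective mentioned above; the measurement operator, dimensions, and nuclear-norm radius are illustrative assumptions, not values from that paper.

```python
# Minimal sketch of the nuclear-norm constrained LAD objective:
#   minimize ||A(M) - y||_1  subject to  ||M||_* <= R.
# The linear measurement operator, dimensions, and radius R are illustrative assumptions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
d, r, m = 20, 2, 300

M_star = rng.normal(size=(d, r)) @ rng.normal(size=(r, d))   # rank-r ground-truth matrix
A = rng.normal(size=(m, d * d))                               # Gaussian measurement operator
y = A @ M_star.ravel(order="F")                               # column-major to match cp.vec below
bad = rng.choice(m, m // 10, replace=False)
y[bad] += rng.normal(scale=20.0, size=bad.size)               # sparse gross corruptions

M = cp.Variable((d, d))
R = np.linalg.norm(M_star, ord="nuc")                         # assumes a known nuclear-norm radius
problem = cp.Problem(cp.Minimize(cp.norm(A @ cp.vec(M) - y, 1)),
                     [cp.normNuc(M) <= R])
problem.solve()

print("relative error:", np.linalg.norm(M.value - M_star) / np.linalg.norm(M_star))
```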

Reconstruction under outliers for Fourier-sparse functions

It is shown that, over the torus, assuming the Fourier transform satisfies a certain \emph{granularity} condition, there is a sample-efficient algorithm that tolerates a $\rho = \Omega(1)$ fraction of outliers, and further that this is not possible without such a granularity condition.

Online Robust Regression via SGD on the l1 loss

It is shown in this work that stochastic gradient descent on the $\ell_1$ loss converges to the true parameter vector at a $\tilde{O}( 1 / (1 - \eta)^2 n )$ rate which is independent of the values of the contaminated measurements.
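
A minimal numpy sketch of SGD on the $\ell_1$ loss in this streaming setting is given below; the step-size schedule and contamination model are illustrative assumptions rather than the paper's exact choices.

```python
# Minimal sketch of online SGD on the l1 loss: per sample, the subgradient of
# |<x, w> - y| with respect to w is sign(<x, w> - y) * x.
# The step-size schedule and the contamination model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, T, eta = 50, 20000, 0.2               # dimension, number of streamed samples, corruption rate

w_star = rng.normal(size=n)
w = np.zeros(n)

for t in range(1, T + 1):
    x = rng.normal(size=n)
    y = x @ w_star
    if rng.random() < eta:               # an eta fraction of responses are corrupted
        y = rng.normal(scale=100.0)
    step = 1.0 / np.sqrt(t)              # assumed schedule, not the paper's exact choice
    w -= step * np.sign(x @ w - y) * x   # subgradient step on the l1 loss

print("estimation error:", np.linalg.norm(w - w_star))
```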

Outlier-robust sparse/low-rank least-squares regression and robust matrix completion

We study high-dimensional least-squares regression within a subgaussian statistical learning framework with heterogeneous noise. It includes $s$-sparse and $r$-low-rank least-squares regression when…

Outlier-robust estimation of a sparse linear model using $\ell_1$-penalized Huber's $M$-estimator

It is proved that the $\ell_1$-penalized Huber's M-estimator based on $n$ samples attains the optimal rate of convergence, up to a logarithmic factor, in the case where the labels are contaminated by at most adversarial outliers.
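
For reference, a minimal cvxpy sketch of an $\ell_1$-penalized Huber $M$-estimator follows; the Huber threshold and penalty level are hypothetical tuning parameters, not the values analyzed in the paper.

```python
# Minimal sketch of an l1-penalized Huber M-estimator:
#   minimize  sum_i huber(y_i - <x_i, w>) + lam * ||w||_1.
# The Huber threshold delta and penalty lam are hypothetical tuning parameters.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(3)
n, d, s = 150, 300, 5

w_star = np.zeros(d)
w_star[:s] = 3.0
X = rng.normal(size=(n, d))
y = X @ w_star + 0.1 * rng.normal(size=n)
out = rng.choice(n, 10, replace=False)
y[out] += 50.0                                        # a handful of adversarial label corruptions

w = cp.Variable(d)
lam, delta = 0.1, 1.0
objective = cp.sum(cp.huber(y - X @ w, delta)) + lam * cp.norm(w, 1)
cp.Problem(cp.Minimize(objective)).solve()

print("largest recovered coordinates:", np.argsort(-np.abs(w.value))[:s])
```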

Regress Consistently when Oblivious Outliers Overwhelm

In the special case of Gaussian design, it is shown that a strikingly simple algorithm based on computing coordinate-wise medians achieves similar guarantees in linear time and extends to the settings where the parameter vector $\beta^*$ is sparse.

List-Decodable Linear Regression

This work gives the first polynomial-time algorithm for robust regression in the list-decodable setting, where an adversary can corrupt more than a $1/2$ fraction of examples, and proves that the anti-concentration assumption on the inliers is information-theoretically necessary.

Recovery guarantees for polynomial approximation from dependent data with outliers

The main contribution of this paper is to provide a reconstruction guarantee for the associated $\ell_1$-optimization problem where the sampling matrix is formed from dependent data, and to prove that the sampling matrix satisfies the null space property and the stable null space property.

References

Efficient Algorithms and Lower Bounds for Robust Linear Regression

Any polynomial-time SQ learning algorithm for robust linear regression (in Huber's contamination model) with estimation complexity must incur an error of $\Omega(\sqrt{\epsilon} \sigma)$.

Robust Regression via Hard Thresholding

A simple hard-thresholding algorithm called TORRENT is studied which, under mild conditions on X, can recover w* exactly even if b corrupts the response variables in an adversarial manner, i.e. both the support and entries of b are selected adversarially after observing X and w*.
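
A minimal numpy sketch of a TORRENT-style hard-thresholding loop follows; the truncation fraction, iteration count, and data model are illustrative assumptions, not the paper's exact algorithm or analysis.

```python
# Minimal sketch of a TORRENT-style (fully corrective) hard-thresholding loop:
# alternately fit least squares on the (1 - beta) fraction of points with the
# smallest residuals, then re-select the active set. The fraction beta, the
# iteration count, and the data model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
n, d, beta = 500, 20, 0.2

w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_star
bad = rng.choice(n, int(0.15 * n), replace=False)
y[bad] = rng.normal(scale=50.0, size=bad.size)    # adversarially replaced responses

w = np.zeros(d)
keep = int((1 - beta) * n)
for _ in range(20):
    residuals = np.abs(y - X @ w)
    S = np.argsort(residuals)[:keep]              # active set: points with smallest residuals
    w, *_ = np.linalg.lstsq(X[S], y[S], rcond=None)

print("recovery error:", np.linalg.norm(w - w_star))
```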

Compressed Sensing and Matrix Completion with Constant Proportion of Corruptions

It is proved that one can recover an $n \times n$ low-rank matrix from $m$ corrupted sampled entries by tractable optimization provided the rank is on the order of $O(m/(n\log^2 n))$; again, this holds when there is a positive fraction of corrupted samples.

Corrupted Sensing: Novel Guarantees for Separating Structured Signals

This work analyzes both penalized programs that trade off between signal and corruption complexity, and constrained programs that bound the complexity of the signal or corruption when prior information is available, and provides new interpretable bounds for the Gaussian complexity of sparse vectors, block-sparse vectors, and low-rank matrices.

Exact signal recovery from sparsely corrupted measurements through the Pursuit of Justice

It is demonstrated that a simple algorithm, which is dubbed Justice Pursuit (JP), can achieve exact recovery from measurements corrupted with sparse noise.
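
Below is a minimal cvxpy sketch of a Justice-Pursuit-style program; the dimensions, sparsity levels, and scaling are illustrative assumptions.

```python
# Minimal sketch of a Justice-Pursuit-style program: jointly recover a sparse
# signal x and a sparse corruption e by l1 minimization subject to exact
# measurement consistency. Dimensions and sparsity levels are illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(5)
n, m, k, kc = 200, 120, 5, 8                      # signal dim, measurements, signal/corruption sparsity

x_star = np.zeros(n)
x_star[rng.choice(n, k, replace=False)] = rng.normal(size=k)
Phi = rng.normal(size=(m, n)) / np.sqrt(m)        # Gaussian sensing matrix
e_star = np.zeros(m)
e_star[rng.choice(m, kc, replace=False)] = rng.normal(scale=5.0, size=kc)
y = Phi @ x_star + e_star

x, e = cp.Variable(n), cp.Variable(m)
problem = cp.Problem(cp.Minimize(cp.norm(x, 1) + cp.norm(e, 1)),
                     [Phi @ x + e == y])
problem.solve()

print("signal error:", np.linalg.norm(x.value - x_star))
```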

High Dimensional Robust Sparse Regression

A filtering algorithm is given which consists of a novel randomized outlier removal technique for robust sparse mean estimation that may be of interest in its own right; the filtering algorithm is flexible enough to deal with unknown covariance.

The price of privacy and the limits of LP decoding

The principal result is the discovery of a sharp threshold $\rho^* \approx 0.239$, which says that any privacy mechanism, interactive or non-interactive, providing reasonably accurate answers to a $0.761$ fraction of randomly generated weighted subset-sum queries is blatantly non-private.

Exact Recoverability From Dense Corrupted Observations via $\ell _{1}$-Minimization

It is confirmed that stable recovery is possible when measurements are polluted by both gross sparse and small dense errors, and shown that with high probability, $\ell_1$-minimization can recover the sparse signal of interest.

Consistent Robust Regression

It is shown that CRR not only offers consistent estimates, but is empirically far superior to several other recently proposed algorithms for the robust regression problem, including extended Lasso and the TORRENT algorithm.

Sever: A Robust Meta-Algorithm for Stochastic Optimization

This work introduces a new meta-algorithm that can take in a base learner such as least squares or stochastic gradient descent, and harden the learner to be resistant to outliers, and finds that in both cases it has substantially greater robustness than several baselines.