Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach

Abstract

This paper develops theoretical results regarding noisy 1-bit compressed sensing and sparse binomial regression. We demonstrate that a single convex program gives an accurate estimate of the signal, or coefficient vector, for both of these models. We show that an s-sparse signal in R^n can be accurately estimated from m = O(s log(n/s)) single-bit measurements using a simple convex program. This remains true even if each measurement bit is flipped with probability nearly 1/2. Worst-case (adversarial) noise can also be accounted for, and uniform results that hold for all sparse inputs are derived as well. In the terminology of sparse logistic regression, we show that O(s log(2n/s)) Bernoulli trials are sufficient to estimate a coefficient vector in R^n which is approximately s-sparse. Moreover, the same convex program works for virtually all generalized linear models, in which the link function may be unknown. To our knowledge, these are the first results that tie the theory of sparse logistic regression to 1-bit compressed sensing. Our results apply to general signal structures beyond sparsity; one only needs to know the size of the set K where signals reside. The size is given by the mean width of K, a computable quantity whose square serves as a robust extension of the dimension.

DOI: 10.1109/TIT.2012.2207945
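As a concrete illustration, the convex program described in the abstract can be sketched numerically: maximize the linear objective <y, Ax> over the set K = {||x||_1 <= sqrt(s), ||x||_2 <= 1}. Because the objective is linear, the maximizer reduces to soft-thresholding A^T y and normalizing, with the threshold tuned by bisection; this closed-form reduction is a standard duality fact, and all function names and parameter choices below are our own illustration, not code from the paper.

```python
import numpy as np

def soft_threshold(z, lam):
    """Entrywise soft-thresholding operator S_lam(z)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def one_bit_estimate(A, y, s):
    """Maximize <y, Ax> over K = {||x||_1 <= sqrt(s), ||x||_2 <= 1}.

    The maximizer of a linear objective over K is a normalized
    soft-thresholding of z = A^T y, with the threshold chosen (by
    bisection) so that the l1/l2 ratio of the result equals sqrt(s).
    """
    z = A.T @ y
    t = np.sqrt(s)
    if np.linalg.norm(z, 1) <= t * np.linalg.norm(z):
        return z / np.linalg.norm(z)      # l1 constraint is inactive
    lo, hi = 0.0, np.abs(z).max()
    for _ in range(100):                  # bisection on the threshold
        lam = 0.5 * (lo + hi)
        v = soft_threshold(z, lam)
        nv = np.linalg.norm(v)
        if nv == 0.0 or np.linalg.norm(v, 1) / nv <= t:
            hi = lam                      # threshold too aggressive
        else:
            lo = lam                      # l1/l2 ratio still too large
    v = soft_threshold(z, lo)
    return v / np.linalg.norm(v)

# Demo: recover a 5-sparse unit vector from single-bit measurements,
# 10% of which are flipped at random (the "noisy bits" setting).
rng = np.random.default_rng(0)
n, m, s = 100, 1000, 5
x_true = np.zeros(n)
x_true[:s] = rng.standard_normal(s)
x_true /= np.linalg.norm(x_true)
A = rng.standard_normal((m, n))
y = np.sign(A @ x_true)
y[rng.random(m) < 0.1] *= -1.0            # random bit flips
x_hat = one_bit_estimate(A, y, s)
print(float(x_hat @ x_true))              # correlation with the truth
```

With m on the order of s log(n/s), the correlation between the estimate and the true unit vector should be close to 1 despite the flipped bits, consistent with the abstract's claim of robustness to random measurement noise.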


Cite this paper

@article{Plan2013Robust1C,
  title={Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach},
  author={Yaniv Plan and Roman Vershynin},
  journal={IEEE Transactions on Information Theory},
  year={2013},
  volume={59},
  pages={482--494}
}