Corpus ID: 204877760

Lower Bounds for Compressed Sensing with Generative Models

@article{Kamath2019LowerBF,
  title={Lower Bounds for Compressed Sensing with Generative Models},
  author={Akshay Kamath and Sushrut Karmalkar and Eric Price},
  journal={ArXiv},
  year={2019},
  volume={abs/1912.02938}
}
The goal of compressed sensing is to learn a structured signal $x$ from a limited number of noisy linear measurements $y \approx Ax$. In traditional compressed sensing, "structure" is represented by sparsity in some known basis. Inspired by the success of deep learning in modeling images, recent work starting with~\cite{BJPD17} has instead considered structure to come from a generative model $G: \mathbb{R}^k \to \mathbb{R}^n$. We present two results establishing the difficulty of this latter… 
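To make the measurement model concrete, here is a minimal sketch, not taken from the paper, of the recovery approach of \cite{BJPD17} that this line of work studies: minimize the misfit $\|AG(z) - y\|_2^2$ over the latent code $z$. The toy random ReLU network standing in for $G$ and the use of SciPy's derivative-free optimizer are assumptions made only to keep the example self-contained.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
k, n, m = 5, 100, 40                      # latent dim, signal dim, measurements

# Toy generative model G: R^k -> R^n (a fixed random two-layer ReLU network).
W1 = rng.normal(size=(64, k))
W2 = rng.normal(size=(n, 64))
G = lambda z: W2 @ np.maximum(W1 @ z, 0.0) / 64.0

z_star = rng.normal(size=k)               # ground-truth latent code
x_star = G(z_star)                        # structured signal in range(G)
A = rng.normal(size=(m, n)) / np.sqrt(m)  # Gaussian measurement matrix
y = A @ x_star + 0.01 * rng.normal(size=m)

# Recover by minimizing the measurement misfit over z. The objective is
# non-convex; a derivative-free method on a tiny k keeps the sketch simple.
f = lambda z: np.sum((A @ G(z) - y) ** 2)
res = minimize(f, rng.normal(size=k), method="Nelder-Mead",
               options={"maxiter": 20000, "fatol": 1e-12})
x_hat = G(res.x)
print("relative error:", np.linalg.norm(x_hat - x_star) / np.linalg.norm(x_star))
```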

Robust compressed sensing using generative models

This paper proposes an algorithm inspired by the Median-of-Means (MOM) that guarantees recovery for heavy-tailed data, even in the presence of outliers, and results show the novel MOM-based algorithm enjoys the same sample complexity guarantees as ERM under sub-Gaussian assumptions.

Robust compressed sensing of generative models

This paper proposes an algorithm inspired by the Median-of-Means (MOM) that guarantees recovery for heavy-tailed data, even in the presence of outliers, and empirically demonstrates the predicted robustness.
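For background on the two entries above, here is a minimal sketch of the generic Median-of-Means primitive they build on (scalar data is an assumption for illustration; this is not the papers' full recovery algorithm). Splitting into batches confines each outlier to a single batch mean, which the median then ignores.

```python
import numpy as np

def median_of_means(samples, num_batches, rng):
    """Split samples into batches, average each, return the median of the means."""
    batches = np.array_split(rng.permutation(samples), num_batches)
    return float(np.median([b.mean() for b in batches]))

rng = np.random.default_rng(1)
data = rng.standard_t(df=2.0, size=10_000)   # heavy-tailed, true mean 0
data[:50] = 1e6                              # a few gross outliers
print("empirical mean :", data.mean())       # dragged far from 0 by the outliers
print("median of means:", median_of_means(data, num_batches=200, rng=rng))
```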

Sample Complexity Lower Bounds for Compressive Sensing with Generative Models

  • Zhaoqiang Liu, J. Scarlett
  • Computer Science
    2020 International Conference on Signal Processing and Communications (SPCOM)
  • 2020
This paper establishes corresponding algorithm-independent lower bounds on the sample complexity using tools from minimax statistical analysis, and shows that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions.

Sample Complexity Bounds for 1-bit Compressive Sensing and Binary Stable Embeddings with Generative Priors

It is demonstrated that the Binary $\epsilon$-Stable Embedding property, which characterizes the robustness of the reconstruction to measurement errors and noise, also holds for 1-bit compressive sensing with Lipschitz continuous generative models with sufficiently many Gaussian measurements.

Robust One-Bit Recovery via ReLU Generative Networks: Near-Optimal Statistical Rate and Global Landscape Analysis

It is proved that the ERM estimator in this new framework achieves a statistical rate of $m=\widetilde{\mathcal{O}}(kn \log d /\varepsilon^2)$, recovering any $G(x_0)$ uniformly up to an error $\varepsilon$ when the network is shallow.
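To illustrate the 1-bit measurement model shared by the two entries above: only the signs $y = \mathrm{sign}(Ax)$ are observed, so recovery is possible at best up to scale. The sketch below uses the standard normalized linear estimator for Gaussian measurements as a baseline; this choice is an assumption for illustration, not the ERM or embedding analysis of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 50, 5000
x = rng.normal(size=n)
x /= np.linalg.norm(x)                    # only the direction is recoverable

A = rng.normal(size=(m, n))               # Gaussian measurement matrix
y = np.sign(A @ x)                        # 1-bit (sign-only) measurements

# For Gaussian rows a_i, E[a_i * sign(a_i^T x)] is proportional to x/||x||,
# so the normalized linear estimate approximates the signal direction.
x_hat = A.T @ y / m
x_hat /= np.linalg.norm(x_hat)
print("cosine similarity:", x @ x_hat)    # approaches 1 as m grows
```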

Information-Theoretic Lower Bounds for Compressive Sensing With Generative Models

Using tools from minimax statistical analysis, it is established that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions.

The Generalized Lasso with Nonlinear Observations and Generative Priors

This paper provides a non-uniform recovery guarantee, and shows that this result can be extended to the uniform recovery guarantee under the assumption of a so-called local embedding property, which is satisfied by the 1-bit and censored Tobit models.

Towards Sample-Optimal Compressive Phase Retrieval with Sparse and Generative Priors

This paper provides recovery guarantees with near-optimal sample complexity for phase retrieval with generative priors, and proposes a practical spectral initialization method motivated by recent advances in deep generative models.
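As background for the spectral initialization mentioned above, here is a minimal sketch of the classical phase-retrieval spectral method for Gaussian measurements (without the generative prior, which is where the cited paper departs from this baseline): the leading eigenvector of $\frac{1}{m}\sum_i y_i^2 a_i a_i^\top$ aligns with $\pm x/\|x\|$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 40, 4000
x = rng.normal(size=n)
x /= np.linalg.norm(x)

A = rng.normal(size=(m, n))               # rows a_i are the sensing vectors
y = np.abs(A @ x)                         # phaseless (magnitude) measurements

# Build Y = (1/m) * sum_i y_i^2 a_i a_i^T; its expectation is I + 2 x x^T,
# so the top eigenvector concentrates around +/- x.
Y = (A.T * y**2) @ A / m
eigvals, eigvecs = np.linalg.eigh(Y)
x0 = eigvecs[:, -1]                       # leading eigenvector
print("alignment |<x0, x>|:", abs(x0 @ x))  # close to 1 (global sign ambiguity)
```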

Optimal Sample Complexities for Compressed Sensing with Approximate Generative Priors

This work implements the conditional resampling estimator for deep generative priors using Langevin dynamics, and empirically finds that it produces accurate estimates with more diversity than MAP.

Instance-Optimal Compressed Sensing via Posterior Sampling

The posterior sampling estimator for deep generative priors is implemented using Langevin dynamics, and is empirically found to produce accurate estimates with more diversity than MAP.
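To show the mechanics behind the two entries above, here is a minimal sketch of unadjusted Langevin dynamics for posterior sampling, under simplifying assumptions chosen to make the score closed-form: a linear stand-in generator $G(z) = Wz$, a standard normal prior on $z$, and a known Gaussian noise level.

```python
import numpy as np

rng = np.random.default_rng(3)
k, n, m, sigma = 4, 60, 30, 0.05
W = rng.normal(size=(n, k)) / np.sqrt(n)  # linear stand-in "generative model"
A = rng.normal(size=(m, n)) / np.sqrt(m)  # Gaussian measurement matrix

z_star = rng.normal(size=k)
y = A @ (W @ z_star) + sigma * rng.normal(size=m)

def score(z):
    # grad_z log p(z | y) for Gaussian likelihood and N(0, I) prior on z
    return -(W.T @ (A.T @ (A @ (W @ z) - y))) / sigma**2 - z

eta, z = 1e-5, rng.normal(size=k)         # step size, initial latent
samples = []
for t in range(20_000):
    # Langevin update: gradient step plus Gaussian noise of matched scale
    z = z + eta * score(z) + np.sqrt(2 * eta) * rng.normal(size=k)
    if t > 10_000:                        # discard burn-in
        samples.append(z)
z_mean = np.mean(samples, axis=0)
print("posterior-mean latent error:", np.linalg.norm(z_mean - z_star))
```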

References


Compressed Sensing using Generative Models

This work shows how to achieve guarantees similar to standard compressed sensing but without employing sparsity at all, and proves that, if $G$ is $L$-Lipschitz, then roughly $O(k \log L)$ random Gaussian measurements suffice for an $\ell_2/\ell_2$ recovery guarantee.

Information-Theoretic Lower Bounds for Compressive Sensing With Generative Models

Using tools from minimax statistical analysis, it is established that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions.

Constructing Small-Bias Sets from Algebraic-Geometric Codes

An explicit construction of an $\epsilon$-biased set over $k$ bits, with size nearly matching the lower bound, is given, yielding binary error-correcting codes that beat the Gilbert-Varshamov bound.

Auto-Encoding Variational Bayes

A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.

Generative Adversarial Nets

We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G.

Lower bounds for sparse recovery

The bound holds even for the more general version of the problem, where $A$ is a random variable and the recovery algorithm is required to work for any fixed $x$ with constant probability (over $A$), and the bound is tight.

Stable signal recovery from incomplete and inaccurate measurements

It is shown that it is possible to recover $x_0$ accurately based on the data $y$ from incomplete and contaminated observations.

On data structures and asymmetric communication complexity

This paper considers two-party communication complexity in the "asymmetric case," when the input sizes of the two players differ significantly, derives two generally applicable methods of proving lower bounds, and obtains several applications.