Compressed Sensing With Nonlinear Observations and Related Nonlinear Optimization Problems

@article{Blumensath2012CompressedSW,
  title={Compressed Sensing With Nonlinear Observations and Related Nonlinear Optimization Problems},
  author={Thomas Blumensath},
  journal={IEEE Transactions on Information Theory},
  year={2012},
  volume={59},
  pages={3466-3474}
}
  • Published 8 May 2012
  • Computer Science
Nonconvex constraints are valuable regularizers in many optimization problems. In particular, sparsity constraints have had a significant impact on sampling theory, where they are used in compressed sensing and allow structured signals to be sampled far below the rate traditionally prescribed. Nearly all of the theory developed for compressed sensing signal recovery assumes that samples are taken using linear measurements. In this paper, we instead address the compressed sensing recovery…
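The recovery approach the abstract alludes to generalizes iterative hard thresholding to nonlinear measurement maps. The sketch below is only illustrative, not the paper's exact algorithm: the quadratic measurement map `A`, its Jacobian `J`, and all sizes and step parameters are hypothetical choices for demonstration.

```python
import numpy as np

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries of x, zero out the rest.
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    z[keep] = x[keep]
    return z

def nonlinear_iht(y, A, J, n, s, mu=0.5, iters=300):
    # Generalized IHT: x <- H_s( x + mu * J(x)^T (y - A(x)) ),
    # where A is the (possibly nonlinear) measurement map and J its Jacobian.
    x = np.zeros(n)
    for _ in range(iters):
        x = hard_threshold(x + mu * J(x).T @ (y - A(x)), s)
    return x

# Hypothetical mildly nonlinear measurement map: A(x) = Phi x + 0.1 (Phi x)^2.
rng = np.random.default_rng(0)
m, n, s = 60, 100, 3
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
A = lambda x: Phi @ x + 0.1 * (Phi @ x) ** 2
J = lambda x: (1.0 + 0.2 * (Phi @ x))[:, None] * Phi  # Jacobian of A at x
x_true = np.zeros(n)
x_true[[5, 17, 42]] = [1.0, -0.8, 0.6]
x_hat = nonlinear_iht(A(x_true), A, J, n, s)
```

When the nonlinearity is mild and the linearization satisfies a restricted-isometry-type condition, the iteration behaves much like linear IHT; the noiseless sparse signal is a fixed point of the update.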

The Geometry of Compressed Sensing

This chapter introduces the union-of-subspaces interpretation and related geometric concepts, shows how they can be used to develop algorithms that recover signals with a given structure, and presents theoretical results that characterise the performance of these algorithmic approaches.

Quadratic Basis Pursuit

This paper extends the classical compressive sensing framework to a second-order Taylor expansion of the nonlinearity, shows that the sparse signal can be recovered exactly when the sampling rate is sufficiently high, and presents efficient numerical algorithms to recover sparse signals in second-order nonlinear systems.

Nonlinear sampling for sparse recovery

A modified version of the proposed nonlinear sampling technique is introduced that has strong links with spectral estimation methods and exhibits more stable performance under noise and numerical errors.

Nonlinear compressed sensing based on composite mappings and its pointwise linearization

This paper proposes a new concept, nonlinear CS based on composite mappings, and provides a special pointwise linearization method that turns the nonlinear composite mapping Φ, at each point of its domain, into an equivalent linear composite mapping.

On sparse recovery with Structured Noise under sensing constraints

Two iterative denoising algorithms are shown to enhance the quality of sparse recovery in the presence of physical constraints by iteratively estimating and eliminating the nonlinear term from the measurements.

Sparse Signal Recovery Under Sensing and Physical Hardware Constraints

This dissertation develops a more accurate measurement model with structured noise representing a known nonlinear function of the input signal, obtained by leveraging side information about the sampling structure, and devises iterative denoising algorithms shown to enhance the quality of sparse recovery in the presence of physical constraints.

One-Bit Compressive Sensing With Projected Subgradient Method Under Sparsity Constraints

This paper presents the convergence analysis of the binary iterative hard thresholding (BIHT) algorithm which is a state-of-the-art recovery algorithm in one-bit compressive sensing.
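As a rough illustration of the BIHT iteration whose convergence that paper analyzes (a sketch under assumed parameters, not the paper's exact setup): measurements are sign bits y = sign(Φx), each step takes a subgradient move followed by hard thresholding, and the final estimate is normalized because one-bit measurements discard all amplitude information.

```python
import numpy as np

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries, zero the rest.
    z = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    z[keep] = x[keep]
    return z

def biht(y, Phi, s, tau=1.0, iters=100):
    # Binary IHT: x <- H_s( x + (tau/m) * Phi^T (y - sign(Phi x)) ).
    m, n = Phi.shape
    x = np.zeros(n)
    for _ in range(iters):
        x = hard_threshold(x + (tau / m) * Phi.T @ (y - np.sign(Phi @ x)), s)
    # One-bit measurements make the signal identifiable only up to scale,
    # so return a unit-norm estimate.
    return x / np.linalg.norm(x)

# Demo with hypothetical sizes: 500 one-bit measurements of a 3-sparse signal.
rng = np.random.default_rng(1)
m, n, s = 500, 100, 3
Phi = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[5, 17, 42]] = [1.0, -0.8, 0.6]
x_true /= np.linalg.norm(x_true)
y = np.sign(Phi @ x_true)
x_hat = biht(y, Phi, s)
```

Accuracy here is naturally measured by the angle between `x_hat` and `x_true`, since only the direction of the signal is recoverable from sign measurements.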

Performance bounds for Poisson compressed sensing using Variance Stabilization Transforms

  • Deepak Garg, Ajit V. Rajwade
  • Computer Science, Mathematics
    2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2017
The analysis of reconstruction errors for compressed sensing under Poisson noise is challenging due to the signal dependent nature of the noise, and also because the Poisson negative log-likelihood…

Global and Quadratic Convergence of Newton Hard-Thresholding Pursuit

This work builds the theory and algorithm of Newton Hard-Thresholding Pursuit (NHTP) and shows that NHTP is quadratically convergent under the standard assumptions of restricted strong convexity and smoothness, with global convergence to a stationary point under a weaker assumption.

Quasi-linear Compressed Sensing

This work formulates natural generalizations of the well-known restricted isometry property (RIP) to nonlinear measurements, which allow the authors to prove both unique identifiability of sparse signals and the convergence of recovery algorithms that compute them efficiently.
...

References


Model-Based Compressive Sensing

A model-based CS theory is introduced that parallels the conventional theory and provides concrete guidelines on how to create model-based recovery algorithms with provable performance guarantees, together with a new class of structured compressible signals and a new sufficient condition for robust structured compressible signal recovery that is the natural counterpart to the restricted isometry property of conventional CS.

Subspace Pursuit for Compressive Sensing Signal Reconstruction

The presented analysis shows that in the noiseless setting, the proposed algorithm can exactly reconstruct arbitrary sparse signals provided that the sensing matrix satisfies the restricted isometry property with a constant parameter.

Hard Thresholding Pursuit: An Algorithm for Compressive Sensing

  • S. Foucart
  • Computer Science
    SIAM J. Numer. Anal.
  • 2011
A new iterative algorithm to find sparse solutions of underdetermined linear systems is introduced and it is shown that, under a certain condition on the restricted isometry constant of the matrix of the linear system, the Hard Thresholding Pursuit algorithm indeed finds all $s$-sparse solutions.
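The HTP iteration alternates a thresholded gradient step (to pick a candidate support) with an exact least-squares solve on that support. A minimal sketch with hypothetical problem sizes:

```python
import numpy as np

def htp(y, Phi, s, iters=50):
    # Hard Thresholding Pursuit:
    #   1) gradient step, keep the s largest-magnitude coordinates,
    #   2) least-squares solve restricted to that candidate support.
    m, n = Phi.shape
    x = np.zeros(n)
    for _ in range(iters):
        S = np.argsort(np.abs(x + Phi.T @ (y - Phi @ x)))[-s:]
        x = np.zeros(n)
        x[S] = np.linalg.lstsq(Phi[:, S], y, rcond=None)[0]
    return x

# Demo on a hypothetical Gaussian sensing matrix.
rng = np.random.default_rng(2)
m, n, s = 60, 100, 3
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 17, 42]] = [1.0, -0.8, 0.6]
x_hat = htp(Phi @ x_true, Phi, s)
```

In practice HTP is usually stopped as soon as the candidate support stops changing; once the correct support is found, the least-squares step recovers the noiseless signal exactly, so the iterate is a fixed point. A fixed iteration count is used here only for simplicity.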

Greedy sparsity-constrained optimization

  • S. Bahmani, B. Raj, P. Boufounos
  • Computer Science
    2011 Conference Record of the Forty Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
  • 2011
This paper presents a greedy algorithm, dubbed Gradient Support Pursuit (GraSP), for sparsity-constrained optimization, and quantifiable guarantees are provided for GraSP when cost functions have the “Stable Hessian Property”.

Sparse Recovery from Nonlinear Measurements with Applications in Bad Data Detection for Power Networks

This paper numerically evaluates an iterative convex programming approach to perform bad data detection in nonlinear electrical power network problems and provides sharp bounds on the almost Euclidean property of a linear subspace using the “escape-through-a-mesh” theorem from geometric functional analysis.

Convergence of Fixed-Point Continuation Algorithms for Matrix Rank Minimization

The convergence/recoverability properties of the fixed-point continuation algorithm and its variants for matrix rank minimization are studied and heuristics for determining the rank of the matrix when its true rank is not known are proposed.

Sampling and Reconstructing Signals From a Union of Linear Subspaces

  • T. Blumensath
  • Mathematics, Computer Science
    IEEE Transactions on Information Theory
  • 2011
This paper considers a very general setting in which infinitely many subspaces in infinite dimensional Hilbert spaces are allowed, which allows many results derived recently in areas such as compressed sensing, affine rank minimization, analog compressed sensing and structured matrix decompositions to be unified.