# Information Theoretic Bounds for Compressed Sensing

@article{Aeron2010InformationTB,
  title={Information Theoretic Bounds for Compressed Sensing},
  author={S. Aeron and Venkatesh Saligrama and M. Zhao},
  journal={IEEE Transactions on Information Theory},
  year={2010},
  volume={56},
  pages={5111-5130}
}

In this paper, we derive information theoretic performance bounds on sensing and reconstruction of sparse phenomena from noisy projections. We consider two settings: output noise models, where the noise enters after the projection, and input noise models, where the noise enters before the projection. We consider two types of distortion for reconstruction: support errors and mean-squared errors. Our goal is to relate the number of measurements, m, and the SNR to the signal sparsity, k, and the distortion level…
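As a rough numerical illustration of how these quantities interact, the sketch below computes a Fano-type necessary condition on m for exact support recovery. The particular form used here — counting log2 C(n, k) candidate supports against at most 0.5·log2(1 + SNR) bits per Gaussian measurement — is a standard textbook shape chosen for illustration, not a bound quoted from this paper.

```python
import math

def fano_measurement_lower_bound(n, k, snr):
    """Illustrative Fano-type necessary condition on the number of
    measurements m for exact support recovery: the decoder must
    distinguish C(n, k) candidate supports, while each Gaussian
    measurement carries at most 0.5 * log2(1 + snr) bits.
    (Assumed textbook form, not a result from the cited paper.)
    """
    support_bits = math.log2(math.comb(n, k))      # log2 C(n, k)
    bits_per_measurement = 0.5 * math.log2(1.0 + snr)
    return support_bits / bits_per_measurement

# Example: n = 1000, k = 10, SNR = 10; higher SNR lowers the bound.
m_min = fano_measurement_lower_bound(1000, 10, 10.0)
```

The bound grows with the sparsity k (more supports to distinguish) and shrinks as the SNR rises (more information per measurement), mirroring the trade-off the abstract describes.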

#### 167 Citations

An Information Theoretic Study for Noisy Compressed Sensing With Joint Sparsity Model-2

- Computer Science, Mathematics
- ArXiv
- 2016

It is shown that noisy JSM-2 may require fewer measurements than the noisy multiple measurement vectors (MMV) model for reliable support set reconstruction, and the result is compared with existing results for noisy MMV.

Information Theoretic Bounds for Sparse Reconstruction in Random Noise

- Computer Science
- IEEE Access
- 2019

From the analysis of the recovery performance, lower and upper bounds on the probability of error for CS are calculated, and it is proved that perfect reconstruction of the signal vector is impossible if the corresponding conditions are not satisfied.

The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2012

It is shown that recovery with an arbitrarily small but constant fraction of errors is, however, possible, and that in some cases computationally simple estimators are near-optimal.

Information theoretic performance bounds for noisy compressive sensing

- Mathematics, Computer Science
- 2013 IEEE International Conference on Communications Workshops (ICC)
- 2013

The relationships between the bit rate per dimension R(D)/N and M, N, and M/N are given and plotted, and both theoretical analysis and numerical results show that compressive sensing uses fewer bits to represent the same information than conventional information acquisition and reconstruction techniques.

Compressive sensing bounds through a unifying framework for sparse models

- Mathematics, Computer Science
- 2013 IEEE International Conference on Acoustics, Speech and Signal Processing
- 2013

This work investigates the sample complexity of support recovery in sparse signal processing models, with special focus on two compressive sensing scenarios, and establishes sufficient conditions on the number of samples in order to successfully recover the K salient covariates.

An Information-Theoretic Study for Joint Sparsity Pattern Recovery With Different Sensing Matrices

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2017

It is shown that noisy MMV with different sensing matrices may require fewer measurements for reliable support set reconstruction, under a sublinear sparsity regime in a low noise-level scenario.

Enhancing the fundamental limits of sparsity pattern recovery

- Mathematics, Computer Science
- Digit. Signal Process.
- 2017

The results show that, with the aid of the prior knowledge and using the new framework, one can push the performance limits of sparsity pattern recovery significantly.

Performance bounds of compressed sensing recovery algorithms for sparse noisy signals

- Computer Science
- 2013 IEEE Wireless Communications and Networking Conference (WCNC)
- 2013

The analysis shows that OMP performs better than the other three recovery algorithms under the noisy signal model, and that an effective way to limit the impact of noise is to choose a measurement matrix with low correlation between its columns or rows.

Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2013

It is shown that if the measurement rate and per-sample signal-to-noise ratio (SNR) are finite constants independent of the length of the vector, then the optimal sparsity pattern estimate will have a constant fraction of errors.

Limits on Support Recovery With Probabilistic Models: An Information-Theoretic Framework

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2017

This paper takes a unified approach to support recovery problems, considering general probabilistic models relating a sparse data vector to an observation vector, and provides general achievability and converse bounds characterizing the trade-off between the error probability and the number of measurements.

#### References

Showing 1–10 of 62 references

Information theoretic bounds to sensing capacity of sensor networks under fixed SNR

- Mathematics, Computer Science
- 2007 IEEE Information Theory Workshop
- 2007

This paper adopts an information theoretic framework and develops upper and lower bounds on sensing capacity, extending Fano's inequality to incorporate distortion effects as well as continuous signal spaces in order to derive the upper bounds.
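The classical form of Fano's inequality that such extensions start from can be sketched numerically (a minimal illustration of the baseline inequality, not the distortion-aware extension developed in the cited paper):

```python
import math

def fano_error_lower_bound(mutual_info_bits, num_hypotheses):
    """Classical Fano lower bound on the error probability of identifying
    one of `num_hypotheses` equally likely signals from observations
    carrying `mutual_info_bits` bits of mutual information:
        P_e >= 1 - (I(X; Y) + 1) / log2(M)
    """
    return max(0.0, 1.0 - (mutual_info_bits + 1.0) / math.log2(num_hypotheses))

# With no information the decoder is essentially guessing among M = 1024
# signals, so the error probability is at least 1 - 1/10 = 0.9:
p_err = fano_error_lower_bound(0.0, 1024)
```

Incorporating a distortion criterion, as the cited paper does, amounts to shrinking the effective number of hypotheses the decoder must distinguish, which weakens the bound in a controlled way.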

Algorithms and bounds for sensing capacity and compressed sensing with applications to learning graphical models

- Mathematics
- 2008 Information Theory and Applications Workshop
- 2008

We consider the problem of recovering sparse phenomena from projections of noisy data, a topic of interest in compressed sensing. We describe the problem in terms of sensing capacity, which we define…

Shannon-Theoretic Limits on Noisy Compressive Sampling

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2010

It is proved that O(L) (an asymptotically linear multiple of L) measurements are necessary and sufficient for signal recovery, whenever L grows linearly as a function of M.

On sensing capacity of sensor networks for a class of linear observation models

- Mathematics
- 2007 IEEE/SP 14th Workshop on Statistical Signal Processing
- 2007

In this paper we derive fundamental information theoretic upper and lower bounds to sensing capacity of sensor networks for several classes of linear observation models under fixed SNR. We define…

Sampling bounds for sparse support recovery in the presence of noise

- Mathematics, Computer Science
- 2008 IEEE International Symposium on Information Theory
- 2008

It is shown that an unbounded SNR is also a necessary condition for perfect recovery, but any fraction (less than one) of the support can be recovered with bounded SNR, which means that a finite rate per sample is sufficient for partial support recovery.

Measurements vs. Bits: Compressed Sensing meets Information Theory

- Computer Science
- 2006

This work demonstrates that measurement noise is the crucial factor that dictates the number of measurements needed for reconstruction, and concisely captures the effect of measurement noise on the performance limits of signal reconstruction, thus enabling the benchmarking of specific reconstruction algorithms.

Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 2009

For a noisy linear observation model based on random measurement matrices drawn from general Gaussian ensembles, this paper derives both a set of sufficient conditions for exact support recovery using an exhaustive search decoder, as well as a set of necessary conditions that any decoder must satisfy for exact support set recovery.

Thresholded Basis Pursuit: Support Recovery for Sparse and Approximately Sparse Signals

- Computer Science
- 2008

It is shown that the k largest coefficients of a non-sparse signal X can be recovered from m = O(k log(n/k)) random projections for certain classes of signals, which has implications for approximately sparse problems.
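The m = O(k log(n/k)) scaling quoted above grows only logarithmically with the ambient dimension n, which a small numeric sketch makes concrete (the constant factor c is an arbitrary illustrative choice, not a value from the cited work):

```python
import math

def projection_count(n, k, c=2.0):
    """Number of random projections suggested by the m = O(k log(n/k))
    scaling. The constant c is an arbitrary illustrative choice, not a
    constant derived in the cited paper."""
    return math.ceil(c * k * math.log(n / k))

# Multiplying n by 1000 only adds a modest number of projections:
m_small = projection_count(1_000, 10)       # n = 10^3
m_large = projection_count(1_000_000, 10)   # n = 10^6
```

For k = 10 the count rises from 93 to 231 as n goes from a thousand to a million, which is why random projections remain practical in very high dimensions.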

Denoising by Sparse Approximation: Error Bounds Based on Rate-Distortion Theory

- Mathematics, Computer Science
- EURASIP J. Adv. Signal Process.
- 2006

A new bound is given that depends on a bound for approximating a Gaussian signal as a linear combination of elements of an overcomplete dictionary, and asymptotic expressions reveal a critical input signal-to-noise ratio for signal recovery.