Information theoretic limits for linear prediction with graph-structured sparsity
@article{Barik2017InformationTL,
  title={Information theoretic limits for linear prediction with graph-structured sparsity},
  author={Adarsh Barik and Jean Honorio and Mohit Tawarmalani},
  journal={2017 IEEE International Symposium on Information Theory (ISIT)},
  year={2017},
  pages={2348-2352}
}
We analyze the necessary number of samples for sparse vector recovery in a noisy linear prediction setup. This model includes problems such as linear regression and classification. We focus on structured graph models. In particular, we prove that the sufficient number of samples for the weighted graph model proposed by Hegde et al. [2] is also necessary. We use Fano's inequality [11] on well-constructed ensembles as our main tool for establishing information-theoretic lower bounds.
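For context, the generic form of Fano's inequality behind such lower bounds is standard (the paper's specific ensemble construction is not reproduced here): if a parameter θ is drawn uniformly from an ensemble of M candidates and θ̂ is any estimator based on observations Y, then

$$P(\hat{\theta} \neq \theta) \;\ge\; 1 - \frac{I(\theta; Y) + \log 2}{\log M}.$$

Hence every decoder fails with constant probability unless the number of samples is large enough for the mutual information $I(\theta; Y)$ to grow on the order of $\log M$.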
References
Showing 1–10 of 20 references
A Nearly-Linear Time Framework for Graph-Structured Sparsity
- Computer Science, ICML
- 2015
Introduces a flexible framework for sparsity structures defined via graphs that generalizes several previously studied sparsity models and achieves an information-theoretically optimal sample complexity for a wide range of parameters.
Information-theoretic bounds on model selection for Gaussian Markov random fields
- Computer Science, Mathematics, 2010 IEEE International Symposium on Information Theory
- 2010
The first result establishes a set of necessary conditions on n(p, d) for any recovery method to consistently estimate the underlying graph, and the second result provides necessary conditions for any decoder to produce an estimate Θ̂ of the true inverse covariance matrix Θ satisfying ‖Θ̂ − Θ‖∞ < δ in the elementwise ℓ∞-norm.
Information-Theoretic Limits of Selecting Binary Graphical Models in High Dimensions
- Computer Science, Mathematics, IEEE Transactions on Information Theory
- 2012
The information-theoretic limitations of graph selection for binary Markov random fields are analyzed under high-dimensional scaling, in which the graph size p and the number of edges k, and/or the maximal node degree d, are allowed to increase to infinity as a function of the sample size n.
A fast approximation algorithm for tree-sparse recovery
- Computer Science, 2014 IEEE International Symposium on Information Theory
- 2014
This work proposes an alternative approach to tree-sparse recovery that is based on a specific approximation algorithm for tree-projection and provably has a near-linear runtime of O(n log(kr)) and a memory cost of O(n), where r is the dynamic range of the signal.
A Simple Proof of the Restricted Isometry Property for Random Matrices
- Mathematics
- 2008
We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for random matrices that underlies Compressed Sensing. Our approach has two main…
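For context, the Restricted Isometry Property of order k, as introduced by Candès and Tao, requires that for some constant $\delta_k \in (0, 1)$,

$$(1 - \delta_k)\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1 + \delta_k)\,\|x\|_2^2$$

holds for all k-sparse vectors x.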
Model-Based Compressive Sensing
- Computer Science, IEEE Transactions on Information Theory
- 2010
A model-based CS theory is introduced that parallels the conventional theory and provides concrete guidelines on how to create model-based recovery algorithms with provable performance guarantees, together with a new class of structured compressible signals and a new sufficient condition for robust structured compressible signal recovery that is the natural counterpart to the restricted isometry property of conventional CS.
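To our reading, that counterpart (the model-RIP) is the same two-sided bound restricted to the structured signal set: letting $\mathcal{M}_k$ denote the union of subspaces permitted by the model,

$$(1 - \delta_{\mathcal{M}_k})\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1 + \delta_{\mathcal{M}_k})\,\|x\|_2^2 \quad \text{for all } x \in \mathcal{M}_k,$$

which is a weaker requirement than the standard RIP and therefore attainable with fewer measurements.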
Sample complexity for 1-bit compressed sensing and sparse classification
- Computer Science, 2010 IEEE International Symposium on Information Theory
- 2010
This paper considers the problem of identifying the support set of a high-dimensional sparse vector, from noise-corrupted 1-bit measurements. We present passive and adaptive algorithms for this…
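For context, the 1-bit measurement model in its common formulation retains only the sign of each (possibly noise-corrupted) linear measurement:

$$y_i = \operatorname{sign}(\langle a_i, x \rangle + e_i), \qquad i = 1, \dots, n,$$

where the exact placement of the noise $e_i$ (additive before the sign, or random sign flips after it) varies across papers; support recovery asks for the nonzero coordinates of $x$ given only the binary vector $y$.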
One-Bit Compressed Sensing: Provable Support and Vector Recovery
- Computer Science, ICML
- 2013
This paper proposes two novel and efficient solutions for support recovery based on two combinatorial structures, union-free families of sets and expanders, and gives the first method to recover a sparse vector using a near-optimal number of measurements.
Subspace Pursuit for Compressive Sensing: Closing the Gap Between Performance and Complexity
- Computer Science, arXiv
- 2008
The presented analysis shows that in the noiseless setting, the proposed algorithm can exactly reconstruct arbitrary sparse signals provided that the sensing matrix satisfies the restricted isometry property with a constant parameter.
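As a concrete illustration, here is a minimal NumPy sketch of the subspace pursuit iteration described by Dai and Milenkovic; the function and variable names are ours, and the sketch omits the RIP-based guarantees of the original analysis.

```python
import numpy as np

def subspace_pursuit(A, y, k, max_iter=50):
    """Sketch of subspace pursuit: estimate a k-sparse x from y ~ A x."""
    # Initialize the support with the k columns most correlated with y.
    support = np.argsort(-np.abs(A.T @ y))[:k]
    coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coeffs
    prev_norm = np.linalg.norm(residual)

    for _ in range(max_iter):
        # Expand: merge the support with the k largest residual correlations.
        candidates = np.union1d(
            support, np.argsort(-np.abs(A.T @ residual))[:k])
        # Fit on the expanded support, then prune back to the k largest.
        c_cand, *_ = np.linalg.lstsq(A[:, candidates], y, rcond=None)
        support = candidates[np.argsort(-np.abs(c_cand))[:k]]
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
        norm = np.linalg.norm(residual)
        if norm >= prev_norm:  # residual no longer shrinks; stop
            break
        prev_norm = norm

    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coeffs
    return x_hat
```

The key design choice, relative to purely greedy methods such as OMP, is that subspace pursuit re-evaluates the entire support at every iteration, so coordinates selected early can later be discarded.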