References
Trading Accuracy for Sparsity in Optimization Problems with Sparsity Constraints
- SIAM J. Optim., 2010
This work studies the problem of minimizing the expected loss of a linear predictor while constraining its sparsity, i.e., bounding the number of features used by the predictor, and analyzes the performance of several approximation algorithms.
Subspace Pursuit for Compressive Sensing Signal Reconstruction
- IEEE Transactions on Information Theory, 2009
The presented analysis shows that in the noiseless setting, the proposed algorithm can exactly reconstruct arbitrary sparse signals provided that the sensing matrix satisfies the restricted isometry property with a constant parameter.
Orthogonal Matching Pursuit with Replacement
- NIPS, 2011
This paper proposes a novel partial hard-thresholding operator that leads to a general family of iterative algorithms that includes Orthogonal Matching Pursuit with Replacement (OMPR), and extends OMPR using locality sensitive hashing to get OMPR-Hash, the first provably sub-linear algorithm for sparse recovery.
Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- IEEE Transactions on Information Theory, 2011
It is shown that under conditions on the mutual incoherence and the minimum magnitude of the nonzero components of the signal, the support of the signals can be recovered exactly by the OMP algorithm with high probability.
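As a rough illustration of the greedy scheme these OMP guarantees concern, here is a minimal NumPy sketch (dimensions and variable names are illustrative, not from the paper):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit sketch: greedily select k columns of A
    most correlated with the residual, re-solving least squares after
    each selection (the 'orthogonal' step)."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0  # never re-pick a selected column
        support.append(int(np.argmax(correlations)))
        # Refit the coefficients on the whole selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x
```

In the noiseless, incoherent regime the entry describes, this loop recovers the true support exactly with high probability.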
CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Commun. ACM, 2010
This extended abstract describes a recent algorithm, called CoSaMP, that accomplishes the data recovery task and was the first known method to offer near-optimal guarantees on resource usage.
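The CoSaMP iteration (identify 2k candidates from a correlation proxy, merge with the current support, least-squares fit, prune back to k) can be sketched as follows; this is an illustrative NumPy rendering, not the authors' reference implementation:

```python
import numpy as np

def cosamp(A, y, k, iters=30):
    """CoSaMP sketch: merge the 2k largest proxy entries with the current
    support, solve least squares on the merged support, prune to k."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        # Correlation proxy for the residual signal.
        proxy = A.T @ (y - A @ x)
        # Identify 2k candidate indices and merge with the current support.
        omega = np.argpartition(np.abs(proxy), -2 * k)[-2 * k:]
        support = np.union1d(omega, np.flatnonzero(x))
        # Least squares on the merged support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        b = np.zeros(n)
        b[support] = coef
        # Prune: keep only the k largest-magnitude entries.
        top_k = np.argpartition(np.abs(b), -k)[-k:]
        x_new = np.zeros(n)
        x_new[top_k] = b[top_k]
        if np.allclose(x_new, x):
            break
        x = x_new
    return x
```

The prune step keeps every iterate exactly k-sparse, which is what makes the RIP-based analysis of the entry go through.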
Exact Recovery of Hard Thresholding Pursuit
- NIPS, 2016
This paper shows, for the first time, that exact recovery of the global sparse minimizer is possible for HTP-style methods under conditions bounding the restricted strong condition number, and that the support of certain relaxed sparse solutions can be recovered without assuming a bounded restricted strong condition number.
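For context, the HTP-style iteration analyzed in this line of work alternates a gradient step, hard thresholding to the k largest entries, and a least-squares debias on the selected support. A minimal sketch, assuming a unit step size and a well-conditioned Gaussian design (both illustrative choices, not the paper's setting):

```python
import numpy as np

def hard_thresholding_pursuit(A, y, k, step=1.0, iters=100):
    """HTP sketch: gradient step on ||Ax - y||^2, keep the k largest
    magnitudes, then solve least squares on that support (debias)."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        # Gradient step on the least-squares objective.
        g = x + step * (A.T @ (y - A @ x))
        # Hard threshold: indices of the k largest magnitudes.
        support = np.argpartition(np.abs(g), -k)[-k:]
        # Debias: exact least squares restricted to the support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x_new = np.zeros(n)
        x_new[support] = coef
        # The support has stabilized once the iterate stops changing.
        if np.allclose(x_new, x):
            break
        x = x_new
    return x
```

Because the debias step resolves least squares exactly, the iteration terminates as soon as the selected support stops changing.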
Global optimization for sparse solution of least squares problems
- Optim. Methods Softw., 2022
This work proposes dedicated branch-and-bound methods for the exact resolution of moderate-size, yet difficult, sparse optimization problems, through three possible formulations: cardinality-constrained and cardinality-penalized least squares, and cardinality minimization under quadratic constraints.
High-dimensional graphs and variable selection with the Lasso
- 2006
It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs, as it is equivalent to variable selection for Gaussian linear models.
Deconvolution of Point Sources: A Sampling Theorem and Robustness Guarantees
- Communications on Pure and Applied Mathematics, 2018
This work analyzes a convex-programming method for estimating superpositions of point sources or spikes from nonuniform samples of their convolution with a known kernel and derives theoretical guarantees on the robustness of the approach to both dense and sparse additive noise.
Sparse Convex Optimization via Adaptively Regularized Hard Thresholding
- ICML, 2020
This paper presents a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma=O(\kappa)$, which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT.