Support Exploration Algorithm for Sparse Support Recovery

  • Mimoun Mohamed, François Malgouyres, Valentin Emiya, Caroline Chaux

Trading Accuracy for Sparsity in Optimization Problems with Sparsity Constraints

This work studies the problem of minimizing the expected loss of a linear predictor while constraining its sparsity, i.e., bounding the number of features used by the predictor, and analyzes the performance of several approximation algorithms.

Subspace Pursuit for Compressive Sensing Signal Reconstruction

The presented analysis shows that in the noiseless setting, the proposed algorithm can exactly reconstruct arbitrary sparse signals provided that the sensing matrix satisfies the restricted isometry property with a constant parameter.

Orthogonal Matching Pursuit with Replacement

This paper proposes a novel partial hard-thresholding operator that yields a general family of iterative algorithms, including Orthogonal Matching Pursuit with Replacement (OMPR), and extends OMPR with locality-sensitive hashing to obtain OMPR-Hash, the first provably sub-linear algorithm for sparse recovery.

Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise

  • T. Cai and Lie Wang
  • IEEE Transactions on Information Theory, 2011
It is shown that under conditions on the mutual incoherence and the minimum magnitude of the nonzero components of the signal, the support of the signals can be recovered exactly by the OMP algorithm with high probability.
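
The greedy iteration underlying OMP can be sketched in a few lines: repeatedly pick the column most correlated with the residual, re-fit by least squares on the selected support, and update the residual. This is a minimal pure-Python illustration (the matrix is a list of rows, and the small Gaussian-elimination solver is included only to keep the sketch self-contained; none of this code is from the cited paper):

```python
def solve(G, b):
    # Gaussian elimination with partial pivoting for a small linear system G c = b.
    n = len(G)
    M = [row[:] + [bi] for row, bi in zip(G, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    c = [0.0] * n
    for r in range(n - 1, -1, -1):
        c[r] = (M[r][n] - sum(M[r][j] * c[j] for j in range(r + 1, n))) / M[r][r]
    return c

def omp(A, y, k):
    """Greedy OMP sketch: A is an m x n matrix (list of rows), k the target sparsity."""
    m, n = len(A), len(A[0])
    support, r = [], y[:]
    for _ in range(k):
        # Column most correlated with the current residual (excluding chosen ones).
        corr = [abs(sum(A[i][j] * r[i] for i in range(m))) for j in range(n)]
        j_best = max((j for j in range(n) if j not in support), key=lambda j: corr[j])
        support.append(j_best)
        # Least squares on the support via the normal equations.
        s = len(support)
        G = [[sum(A[i][support[a]] * A[i][support[b]] for i in range(m))
              for b in range(s)] for a in range(s)]
        rhs = [sum(A[i][support[a]] * y[i] for i in range(m)) for a in range(s)]
        coef = solve(G, rhs)
        # Residual against the current support fit.
        r = [y[i] - sum(A[i][support[a]] * coef[a] for a in range(s)) for i in range(m)]
    x = [0.0] * n
    for a, j in enumerate(support):
        x[j] = coef[a]
    return x
```

On an identity sensing matrix, for instance, `omp([[1,0,0],[0,1,0],[0,0,1]], [3,0,-2], 2)` recovers the 2-sparse signal exactly; the recovery conditions in the entry above concern when this also holds for general incoherent matrices.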

CoSaMP: Iterative signal recovery from incomplete and inaccurate samples

This extended abstract describes a recent algorithm, called CoSaMP, that accomplishes the data recovery task and was the first known method to offer near-optimal guarantees on resource usage.

Exact Recovery of Hard Thresholding Pursuit

This paper shows, for the first time, that exact recovery of the global sparse minimizer is possible for HTP-style methods under restricted strong condition number bounding conditions, and that the support of certain relaxed sparse solutions can be recovered without assuming a bounded restricted strong condition number.
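
The hard-thresholding operator at the core of HTP-style methods simply keeps the s largest-magnitude entries of a vector and zeroes the rest. A minimal sketch (names are illustrative, not from the paper):

```python
def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero the rest (the operator H_s)."""
    keep = sorted(range(len(x)), key=lambda i: abs(x[i]), reverse=True)[:s]
    out = [0.0] * len(x)
    for i in keep:
        out[i] = x[i]
    return out
```

For example, `hard_threshold([0.5, -3.0, 1.2, 0.1], 2)` keeps only the entries -3.0 and 1.2, which is the projection onto the set of 2-sparse vectors.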

Global optimization for sparse solution of least squares problems

This work proposes dedicated branch-and-bound methods for the exact resolution of moderate-size, yet difficult, sparse optimization problems, through three possible formulations: cardinality-constrained least squares, cardinality-penalized least squares, and cardinality minimization under quadratic constraints.

High-dimensional graphs and variable selection with the Lasso

It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs; neighborhood selection amounts to variable selection in Gaussian linear models.

Deconvolution of Point Sources: A Sampling Theorem and Robustness Guarantees

This work analyzes a convex-programming method for estimating superpositions of point sources or spikes from nonuniform samples of their convolution with a known kernel and derives theoretical guarantees on the robustness of the approach to both dense and sparse additive noise.

Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

This paper presents a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma = O(\kappa)$, which is shown to be tight for a general class of algorithms including LASSO, OMP, and IHT.
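
For context, the IHT baseline mentioned above alternates a gradient step on the least-squares objective with a projection onto s-sparse vectors. A minimal pure-Python sketch, assuming a unit step size and a well-conditioned matrix (the function names and parameters are illustrative, not ARHT itself):

```python
def iht(A, y, s, step=1.0, iters=100):
    """Iterative Hard Thresholding sketch: x <- H_s(x + step * A^T (y - A x))."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual of the current iterate.
        r = [y[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        # Gradient step on 0.5 * ||y - A x||^2.
        g = [x[j] + step * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Hard threshold: keep the s largest-magnitude entries.
        keep = set(sorted(range(n), key=lambda j: abs(g[j]), reverse=True)[:s])
        x = [g[j] if j in keep else 0.0 for j in range(n)]
    return x
```

ARHT's contribution is not this iteration itself but an adaptive regularization that improves the condition-number dependence of such thresholding schemes.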