Publications
New and Improved Johnson-Lindenstrauss Embeddings via the Restricted Isometry Property
TLDR
We show that by randomizing the column signs of such a matrix $\Phi$, the resulting map with high probability embeds any fixed set of $p = O(e^k)$ points in $\mathbb{R}^N$ into $\mathbb{R}^m$ without distorting the norm of any point in the set by more than a factor of $1 \pm \delta$.
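As a rough illustration of this construction (a sketch, not the paper's code), the snippet below uses a Gaussian matrix as a stand-in for an RIP matrix such as a subsampled Fourier transform, randomizes its column signs, and checks the embedding distortion empirically:

```python
import numpy as np

# Sketch of the column-sign randomization: given a matrix Phi with the
# Restricted Isometry Property, the map x -> Phi @ (xi * x), with xi a
# random +/-1 vector, acts as a Johnson-Lindenstrauss embedding w.h.p.
rng = np.random.default_rng(0)
N, m, p = 1024, 256, 50              # ambient dim, embedding dim, point count

# Stand-in for an RIP matrix: i.i.d. Gaussian, normalized so E||Phi x||=||x||.
Phi = rng.standard_normal((m, N)) / np.sqrt(m)
xi = rng.choice([-1.0, 1.0], size=N)   # random column signs

X = rng.standard_normal((N, p))        # a fixed set of p points in R^N
Y = Phi @ (xi[:, None] * X)            # embedded points in R^m

# Empirical distortion: ||y|| / ||x|| should lie in [1 - delta, 1 + delta].
ratios = np.linalg.norm(Y, axis=0) / np.linalg.norm(X, axis=0)
print(f"norm ratios in [{ratios.min():.3f}, {ratios.max():.3f}]")
```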
Stable and Robust Sampling Strategies for Compressive Imaging
TLDR
In many signal processing applications, one wishes to acquire images that are sparse in transform domains such as spatial finite differences or wavelets using frequency domain samples.
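One sampling strategy in the spirit of this line of work is variable-density frequency sampling, which draws low frequencies more often than high ones. The sketch below is an assumption-laden illustration: the inverse-square density is a common choice for gradient-sparse images, not a constant taken from the paper.

```python
import numpy as np

# Hedged illustration of variable-density Fourier sampling: draw each 2-D
# frequency (k1, k2) with probability proportional to 1 / max(1, k1^2 + k2^2)
# (an assumed power-law density chosen for illustration).
rng = np.random.default_rng(1)
n, m = 256, 5000                     # image side length, number of samples

k = np.fft.fftfreq(n) * n            # integer frequencies -n/2 .. n/2 - 1
K1, K2 = np.meshgrid(k, k)
density = 1.0 / np.maximum(1.0, K1**2 + K2**2)
density /= density.sum()

idx = rng.choice(n * n, size=m, replace=False, p=density.ravel())
mask = np.zeros(n * n, dtype=bool)
mask[idx] = True
mask = mask.reshape(n, n)            # True where F(image) is measured

low = (np.abs(K1) < 16) & (np.abs(K2) < 16)
print(f"{mask.sum()} samples; fraction in the lowest 32x32 band: "
      f"{mask[low].sum() / mask.sum():.2f}")
```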
Sparse Legendre expansions via l1-minimization
TLDR
We show that a Legendre $s$-sparse polynomial of maximal degree $N$ can be recovered from $m \asymp s \log^4(N)$ random samples that are chosen independently according to the Chebyshev probability measure. As an efficient recovery method, $\ell_1$-minimization can be used.
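A minimal sketch of this recovery pipeline, assuming numpy and scipy: sample points from the Chebyshev measure, build the Legendre system, and solve the basis-pursuit problem $\min \|c\|_1$ s.t. $Ac = y$ as a linear program.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.optimize import linprog

rng = np.random.default_rng(2)
N, s, m = 80, 5, 40                  # max degree, sparsity, sample count

# s-sparse Legendre coefficient vector to recover.
c_true = np.zeros(N)
c_true[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)

# Chebyshev-distributed samples: x = cos(pi * u), u ~ Uniform(0, 1).
x = np.cos(np.pi * rng.random(m))
A = legendre.legvander(x, N - 1)     # A[i, j] = P_j(x_i)
y = A @ c_true

# Basis pursuit as an LP: write c = u - v with u, v >= 0,
# then minimize sum(u) + sum(v) subject to A (u - v) = y.
res = linprog(c=np.ones(2 * N),
              A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=(0, None), method="highs")
c_hat = res.x[:N] - res.x[N:]
print(f"recovery error: {np.linalg.norm(c_hat - c_true):.2e}")
```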
Stable Image Reconstruction Using Total Variation Minimization
TLDR
This paper presents near-optimal guarantees for stable and robust image recovery from undersampled noisy measurements using total variation minimization.
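A simplified sketch of the theme: recover a gradient-sparse image from few linear measurements by descending on a smoothed total-variation objective. The Gaussian measurement matrix, the smoothing parameter, and plain gradient descent are all assumptions for illustration; the paper analyzes the exact TV minimization program.

```python
import numpy as np

# Minimize 0.5*||A x - y||^2 + lam * sum(sqrt(dx^2 + beta) + sqrt(dy^2 + beta)),
# a smoothed surrogate for anisotropic total variation.
rng = np.random.default_rng(3)
n, m, lam, beta, step = 32, 600, 0.05, 1e-3, 0.05

img = np.zeros((n, n)); img[8:24, 8:24] = 1.0      # piecewise-constant image
A = rng.standard_normal((m, n * n)) / np.sqrt(m)
y = A @ img.ravel()

def tv_gradient(x):
    dx = np.diff(x, axis=1, append=x[:, -1:])       # forward differences,
    dy = np.diff(x, axis=0, append=x[-1:, :])       # zero at the boundary
    wx = dx / np.sqrt(dx**2 + beta)
    wy = dy / np.sqrt(dy**2 + beta)
    g = -wx - wy                                    # adjoint of the differences
    g[:, 1:] += wx[:, :-1]
    g[1:, :] += wy[:-1, :]
    return g

x = np.zeros((n, n))
for _ in range(500):
    grad = (A.T @ (A @ x.ravel() - y)).reshape(n, n) + lam * tv_gradient(x)
    x -= step * grad

print(f"relative error: {np.linalg.norm(x - img) / np.linalg.norm(img):.3f}")
```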
Completing any low-rank matrix, provably
TLDR
Matrix completion, i.e., the exact and provable recovery of a low-rank matrix from a small subset of its elements, is currently only known to be possible if the matrix satisfies a restrictive structural constraint, known as incoherence, on its row and column spaces.
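The idea behind the result is to bias the sampling by local coherences (leverage scores) so that entries aligned with the row and column spaces are observed more often. The sketch below computes the scores from the matrix itself purely for illustration; in practice they are not known in advance.

```python
import numpy as np

# Leverage-score weighted sampling: entry (i, j) is observed with
# probability proportional to mu_i + nu_j, the local coherences of M.
rng = np.random.default_rng(4)
n, r = 100, 3
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
M[0] *= 20.0                            # make row 0 "coherent"

U, _, Vt = np.linalg.svd(M)
mu = np.sum(U[:, :r] ** 2, axis=1)      # row leverage scores, sum to r
nu = np.sum(Vt[:r, :] ** 2, axis=0)     # column leverage scores

P = mu[:, None] + nu[None, :]
P *= (2 * r * n * np.log(n)) / P.sum()  # target ~2 r n log n observations
mask = rng.random((n, n)) < np.minimum(P, 1.0)
print(f"observed {mask.sum()} of {n*n} entries; "
      f"coherent row 0 sampled at rate {mask[0].mean():.2f}")
```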
Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization
TLDR
We present and analyze an efficient implementation of an iteratively reweighted least squares algorithm for recovering a matrix from a small number of linear measurements with an error of the order of the best $k$-rank approximation.
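A compact sketch in the spirit of this algorithm, specialized to entry sampling (matrix completion); the epsilon schedule and iteration count are simplified assumptions. Each iteration reweights by $W = (XX^T + \varepsilon I)^{-1/2}$ and refits the unobserved entries column by column, which reduces to small linear solves because the quadratic objective decouples across columns.

```python
import numpy as np

rng = np.random.default_rng(5)
n, r = 60, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.5          # observed entries

X = np.where(mask, M, 0.0)
eps = 1.0
for _ in range(30):
    e, U = np.linalg.eigh(X @ X.T + eps * np.eye(n))
    W = (U / np.sqrt(e)) @ U.T           # (X X^T + eps I)^(-1/2)
    for j in range(n):
        obs, unobs = mask[:, j], ~mask[:, j]
        if unobs.any():
            # argmin of x^T W x over the unobserved entries of column j
            X[unobs, j] = -np.linalg.solve(W[np.ix_(unobs, unobs)],
                                           W[np.ix_(unobs, obs)] @ M[obs, j])
        X[obs, j] = M[obs, j]
    eps = max(eps / 2, 1e-8)

err = np.linalg.norm(X - M) / np.linalg.norm(M)
print(f"relative completion error: {err:.2e}")
```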
Relax, No Need to Round: Integrality of Clustering Formulations
TLDR
We study exact recovery conditions for convex relaxations of point cloud clustering problems, focusing on two of the most common optimization problems for unsupervised clustering: k-means and k-median clustering.
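A sketch, assuming cvxpy is available, of the Peng–Wei k-means SDP relaxation studied in this line of work: minimize $\langle D, X \rangle$ over nonnegative PSD matrices with $X\mathbf{1} = \mathbf{1}$ and $\mathrm{tr}(X) = k$, where $D$ is the squared-distance matrix. When the clusters are well separated, the optimizer is integral (a normalized cluster indicator matrix), so no rounding is needed.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(6)
k, per = 2, 15
pts = np.vstack([rng.standard_normal((per, 2)) + [8, 0],
                 rng.standard_normal((per, 2)) - [8, 0]])
n = len(pts)
D = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=2)

X = cp.Variable((n, n), PSD=True)
prob = cp.Problem(cp.Minimize(cp.trace(D @ X)),
                  [X >= 0, cp.sum(X, axis=1) == 1, cp.trace(X) == k])
prob.solve()

# For an integral solution, row i of X equals 1/|cluster(i)| on i's cluster.
same_as_0 = X.value[0] > 1 / (2 * per)
print("cluster of point 0 has", same_as_0.sum(), "points")
```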
Interpolation via weighted $l_1$ minimization
Functions of interest are often smooth and sparse in some sense, and both priors should be taken into account when interpolating sampled data. Classical linear interpolation methods are effective …
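A minimal sketch of weighted $\ell_1$ interpolation: solve $\min \sum_j w_j |c_j|$ s.t. $Ac = y$, with weights that grow with the basis index so that smooth, low-order terms are preferred. The specific weights and the Chebyshev basis below are assumptions for illustration; the paper ties the weights to the smoothness prior.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(7)
N, m = 60, 30
w = np.sqrt(np.arange(1, N + 1))           # assumed polynomial-growth weights

x = np.cos(np.pi * rng.random(m))          # Chebyshev-distributed samples
A = np.polynomial.chebyshev.chebvander(x, N - 1)
y = np.cos(3 * np.arccos(x)) + 0.1 * np.cos(7 * np.arccos(x))  # T_3 + 0.1*T_7

# Weighted basis pursuit as an LP via the splitting c = u - v, u, v >= 0.
res = linprog(c=np.concatenate([w, w]),
              A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=(0, None), method="highs")
c_hat = res.x[:N] - res.x[N:]
print("largest coefficients at indices:", np.argsort(-np.abs(c_hat))[:2])
```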
Clustering subgaussian mixtures by semidefinite programming
TLDR
We introduce a model-free relax-and-round algorithm for k-means clustering based on a semidefinite relaxation due to Peng and Wei.
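A sketch of the relax-and-round scheme, assuming cvxpy and scipy: solve the Peng–Wei SDP (as in the earlier k-means sketch), then round the relaxed solution. Rounding by running k-means on a top-$k$ eigenvector embedding of $X$ is one common choice, used here for illustration.

```python
import numpy as np
import cvxpy as cp
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(8)
k, per = 3, 12
centers = np.array([[0, 0], [6, 0], [0, 6]], dtype=float)
pts = np.vstack([rng.standard_normal((per, 2)) + c for c in centers])
n = len(pts)
D = np.sum((pts[:, None] - pts[None, :]) ** 2, axis=2)

# Relax: Peng-Wei k-means SDP.
X = cp.Variable((n, n), PSD=True)
cp.Problem(cp.Minimize(cp.trace(D @ X)),
           [X >= 0, cp.sum(X, axis=1) == 1, cp.trace(X) == k]).solve()

# Round: embed each point via the top-k eigenvectors of X, then k-means.
vals, vecs = np.linalg.eigh(X.value)
embedding = vecs[:, -k:] * np.sqrt(np.maximum(vals[-k:], 0))
_, labels = kmeans2(embedding, k, minit="++")
print("recovered cluster sizes:", np.bincount(labels))
```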