Publications (sorted by influence)
Model-Based Compressive Sensing
TLDR: This paper introduces a model-based CS theory that parallels the conventional theory and provides concrete guidelines on how to create model-based recovery algorithms with provable performance guarantees.
  • 1,515 citations (131 highly influential)
Bilinear Generalized Approximate Message Passing—Part I: Derivation
TLDR: In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems.
  • 128 citations (22 highly influential)
Sparse Signal Recovery Using Markov Random Fields
TLDR: We extend the theory of Compressive Sensing (CS) to include signals that are concisely represented in terms of a graphical model.
  • 192 citations (20 highly influential)
A compressive beamforming method
TLDR: We show that by using random projections of the sensor data, along with a full waveform recording on one reference sensor, a sparse angle space scenario can be reconstructed, giving the number of sources and their DOAs.
  • 151 citations (13 highly influential)
Convex Optimization for Big Data: Scalable, randomized, and parallel algorithms for big data analytics
TLDR: This article reviews recent advances in convex optimization algorithms for big data, which aim to reduce the computational, storage, and communications bottlenecks.
  • 246 citations (12 highly influential)
Submodular Dictionary Selection for Sparse Representation
TLDR: We develop an efficient learning framework to construct signal dictionaries for sparse representation by selecting the dictionary columns from multiple candidate bases.
  • 121 citations (11 highly influential)
Time-Varying Gaussian Process Bandit Optimization
TLDR: We consider the sequential Bayesian optimization problem with bandit feedback, adopting a formulation that allows for the reward function to vary with time.
  • 36 citations (11 highly influential)
Compressive Sensing for Background Subtraction
TLDR: We propose a method to directly recover background subtracted images using CS and discuss its applications in some communication constrained multi-camera computer vision problems.
  • 304 citations (10 highly influential)
Learning with Compressible Priors
  • V. Cevher · Mathematics, Computer Science · NIPS · 7 December 2009
TLDR: We describe a set of probability distributions, dubbed compressible priors, whose independent and identically distributed (iid) realizations result in p-compressible signals.
  • 122 citations (10 highly influential)
WASP: Scalable Bayes via barycenters of subset posteriors
TLDR: We propose a simple, general, and highly efficient approach, which first runs a posterior sampling algorithm in parallel on different machines for subsets of a large data set.
  • 91 citations (9 highly influential)