This paper introduces a model-based CS theory that parallels the conventional theory and provides concrete guidelines on how to create model-based recovery algorithms with provable performance guarantees.

In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems.

We show that by using random projections of the sensor data, along with a full waveform recording on one reference sensor, a sparse angle-space scenario can be reconstructed, giving the number of sources and their DOAs.

This article reviews recent advances in convex optimization algorithms for big data, which aim to reduce the computational, storage, and communications bottlenecks.

We develop an efficient learning framework to construct signal dictionaries for sparse representation by selecting the dictionary columns from multiple candidate bases.

We consider the sequential Bayesian optimization problem with bandit feedback, adopting a formulation that allows for the reward function to vary with time.

We propose a method to directly recover background-subtracted images using compressive sensing (CS) and discuss its applications in some communication-constrained multi-camera computer vision problems.

We describe a set of probability distributions, dubbed compressible priors, whose independent and identically distributed (iid) realizations result in p-compressible signals.
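The property described above can be checked empirically: a signal is p-compressible when its sorted coefficient magnitudes decay roughly like a power law k^(-1/p). The sketch below draws iid samples from a generalized Pareto distribution (one heavy-tailed candidate for a compressible prior; the shape and scale values here are illustrative choices, not taken from the paper) and estimates the log-log decay slope of the sorted magnitudes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's): a Pareto-type
# distribution with tail index a = 2, so sorted magnitudes should decay
# roughly like k**(-1/a) = k**(-0.5).
n, a = 10_000, 2.0
x = rng.pareto(a, size=n)  # NumPy's pareto draws Lomax(a) samples

# Sort magnitudes in decreasing order and fit a line in log-log space
# over an interior range of ranks to estimate the decay exponent.
mags = np.sort(np.abs(x))[::-1]
k = np.arange(1, n + 1)
slope = np.polyfit(np.log(k[10:1000]), np.log(mags[10:1000]), 1)[0]
print(f"empirical log-log decay slope: {slope:.2f}")  # near -1/a
```

A slope close to -1/a here would be consistent with the iid draws forming a p-compressible signal with 1/p ≈ 1/a.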

We propose a simple, general, and highly efficient approach, which first runs a posterior sampling algorithm in parallel on different machines for subsets of a large data set.
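The divide-and-sample idea above can be sketched on a toy Gaussian model. This is an assumption-laden illustration, not the paper's algorithm: the data are partitioned into shards, each shard's subposterior is sampled independently (exact Gaussian draws stand in for per-machine MCMC runs), and the draws are combined with a standard Gaussian product rule in which precisions add and means are precision-weighted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (illustrative): y_i ~ N(theta, 1) with a flat prior on theta,
# so the full-data posterior mean is simply the sample mean of y.
theta_true = 2.0
y = rng.normal(theta_true, 1.0, size=1000)
shards = np.array_split(y, 4)

# "Run a sampler" on each shard: draw from the shard's exact subposterior
# N(mean(shard), 1/len(shard)) as a stand-in for parallel MCMC output.
draws = [rng.normal(s.mean(), 1.0 / np.sqrt(len(s)), size=5000)
         for s in shards]

# Gaussian product combination: precisions add, means are precision-weighted.
precs = np.array([1.0 / np.var(d) for d in draws])
means = np.array([d.mean() for d in draws])
comb_prec = precs.sum()
comb_mean = (precs * means).sum() / comb_prec

full_mean = y.mean()  # exact full-data posterior mean in this toy model
print(f"combined: {comb_mean:.4f}, full-data: {full_mean:.4f}")
```

For this conjugate toy problem the combined estimate should closely match the full-data posterior mean, which is the point of sampling subsets in parallel and merging afterwards.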