Publications
Bayesian Compressive Sensing
TLDR: The data of interest are assumed to be N-dimensional real vectors that are compressible in some linear basis B, implying that the signal can be reconstructed accurately using only a small number M of basis-function coefficients associated with B. A toy reconstruction sketch follows this entry.
  • Citations: 1,774
  • Highly influential citations: 231
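As a rough illustration of the measurement model behind this line of work, the sketch below draws a length-N signal with only K nonzero coefficients (K is introduced here for illustration), takes M ≪ N noisy random projections, and recovers the coefficients with a sparsity-promoting Bayesian linear model. scikit-learn's ARDRegression serves only as a stand-in for the paper's relevance-vector-machine-style inference, and all dimensions and noise levels are illustrative.

    # Compressive-sensing toy problem: recover a K-sparse length-N signal
    # from M << N noisy random projections, using sparsity-promoting
    # Bayesian regression (ARD) as a stand-in for the paper's RVM inference.
    import numpy as np
    from sklearn.linear_model import ARDRegression

    rng = np.random.default_rng(0)
    N, M, K = 256, 64, 8                        # signal length, measurements, nonzeros

    theta = np.zeros(N)                         # sparse coefficients in the basis B
    theta[rng.choice(N, K, replace=False)] = rng.normal(size=K)

    B = np.eye(N)                               # identity basis, for simplicity
    Phi = rng.normal(size=(M, N)) / np.sqrt(M)  # random projection matrix
    y = Phi @ B @ theta + 0.01 * rng.normal(size=M)   # noisy measurements

    model = ARDRegression(fit_intercept=False).fit(Phi @ B, y)
    print("reconstruction error:", np.linalg.norm(model.coef_ - theta))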
Probabilistic Topic Models
TLDR: We review probabilistic topic models: graphical models that can be used to summarize a large collection of documents with a smaller number of distributions over words. The generative process of the canonical model, LDA, is sketched after this entry.
  • Citations: 1,039
  • Highly influential citations: 137
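For concreteness, the canonical model in this family, latent Dirichlet allocation (LDA), has the following generative process, with K topics, D documents, and words w_{d,n}:

    \beta_k \sim \mathrm{Dirichlet}(\eta), \qquad k = 1, \dots, K
    \theta_d \sim \mathrm{Dirichlet}(\alpha), \qquad d = 1, \dots, D
    z_{d,n} \mid \theta_d \sim \mathrm{Discrete}(\theta_d)
    w_{d,n} \mid z_{d,n}, \beta \sim \mathrm{Discrete}(\beta_{z_{d,n}})

Each topic \beta_k is a distribution over the vocabulary, and each document mixes a small number of topics through its proportions \theta_d; these topics are the "smaller number of distributions over words" referred to above.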
Sparse multinomial logistic regression: fast algorithms and generalization bounds
TLDR: We derive fast exact algorithms for learning sparse multiclass classifiers that scale favorably in both the number of training samples and the feature dimensionality, making them applicable even to large data sets in high-dimensional feature spaces. The underlying sparse MAP objective is sketched after this entry.
  • Citations: 844
  • Highly influential citations: 110
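A hedged sketch of the estimator: the multinomial logistic likelihood is combined with a sparsity-promoting (Laplacian, i.e. l1-type) prior on the weight matrix W, and the MAP estimate is

    P(y = c \mid x, W) = \frac{\exp(w_c^{\top} x)}{\sum_{c'=1}^{C} \exp(w_{c'}^{\top} x)}, \qquad
    \hat{W} = \arg\max_{W} \left\{ \sum_{i=1}^{n} \log P(y_i \mid x_i, W) \;-\; \lambda \lVert W \rVert_{1} \right\}.

An off-the-shelf l1-penalized multinomial logistic regression (for example, scikit-learn's LogisticRegression with penalty="l1" and the saga solver) optimizes the same kind of objective, though not with the paper's bound-optimization algorithm.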
Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images
TLDR: Nonparametric Bayesian methods are considered for recovery of imagery based upon compressive, incomplete, and/or noisy measurements.
  • Citations: 347
  • Highly influential citations: 55
Multi-Task Learning for Classification with Dirichlet Process Priors
TLDR: We investigate the problem of learning logistic-regression models for multiple classification tasks, where the training data for the different tasks are not necessarily drawn from the same statistical distribution. The hierarchical Dirichlet-process construction is sketched after this entry.
  • Citations: 515
  • Highly influential citations: 46
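A rough sketch of the hierarchical model, in generic notation rather than the paper's exact symbols: each task j has its own logistic-regression weight vector w_j, and a Dirichlet process prior is shared across tasks,

    G \sim \mathrm{DP}(\alpha, G_0), \qquad w_j \sim G, \qquad
    y_{ij} \mid x_{ij}, w_j \sim \mathrm{Bernoulli}\!\left(\sigma(w_j^{\top} x_{ij})\right).

Because G is almost surely discrete, tasks that draw the same atom share a classifier, so the number of distinct classifiers (and which tasks pool their training data) is inferred from the data rather than fixed in advance.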
Multitask Compressive Sensing
TLDR: This paper analyzes the simultaneous inversion of multiple related signals to enhance the individual compressive sensing (CS) reconstructions. The shared-prior structure is sketched after this entry.
  • Citations: 395
  • Highly influential citations: 45
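A hedged sketch of the multitask setting, in generic notation: each of L tasks has its own measurements and coefficients, while the sparseness-promoting hyperparameters are shared,

    y_i = \Phi_i \theta_i + n_i, \quad i = 1, \dots, L, \qquad
    \theta_{i,j} \mid \alpha_j \sim \mathcal{N}\!\left(0, \alpha_j^{-1}\right), \qquad \alpha_j \sim \mathrm{Gamma}(c, d).

Sharing the precisions \alpha_j across tasks is what lets the related signals borrow statistical strength from one another during reconstruction.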
Exploiting Structure in Wavelet-Based Bayesian Compressive Sensing
  • L. He, L. Carin
  • Mathematics, Computer Science
  • IEEE Transactions on Signal Processing
  • 1 September 2009
TLDR: A new statistical model is developed for the Bayesian compressive sensing (CS) inverse problem, for situations in which the signal of interest is compressible in a wavelet basis. The kind of tree-structured prior involved is sketched after this entry.
  • Citations: 438
  • Highly influential citations: 45
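A hedged sketch of the kind of structure exploited, in generic notation rather than the paper's exact model: a spike-and-slab prior is placed on each wavelet coefficient, and the probability that a coefficient is active depends on whether its parent in the wavelet tree is active, since significant coefficients tend to cluster along parent-child paths across scales,

    \theta_s \sim \pi_s \,\mathcal{N}(0, \sigma_s^{2}) + (1 - \pi_s)\, \delta_0, \qquad
    \pi_s = \begin{cases} \pi_{\mathrm{high}}, & \text{if the parent of } s \text{ is active} \\ \pi_{\mathrm{low}}, & \text{otherwise.} \end{cases}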
Nonparametric factor analysis with beta process priors
TLDR: We propose a nonparametric extension of factor analysis using a beta process prior. A small generative simulation follows this entry.
  • Citations: 268
  • Highly influential citations: 35
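The finite approximation to the beta-Bernoulli construction is easy to simulate. The sketch below (dimensions and hyperparameters are illustrative, not the paper's) draws factor-usage probabilities from a beta prior and lets each sample switch on a sparse subset of K candidate factors.

    # Finite K-atom approximation to the beta-Bernoulli process behind
    # beta process factor analysis: each sample uses a sparse subset of
    # candidate factors. Only the generative side is simulated here.
    import numpy as np

    rng = np.random.default_rng(0)
    N, K, D = 100, 50, 20                            # samples, candidate factors, dimension
    a, b = 1.0, 1.0                                  # beta process mass parameters

    pi = rng.beta(a / K, b * (K - 1) / K, size=K)    # factor-usage probabilities
    Z = rng.random((N, K)) < pi                      # binary factor assignments
    W = rng.normal(size=(N, K))                      # factor scores
    Phi = rng.normal(size=(K, D))                    # factor loadings
    X = (Z * W) @ Phi + 0.1 * rng.normal(size=(N, D))  # observations
    print("average number of factors per sample:", Z.sum(axis=1).mean())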
Bayesian Robust Principal Component Analysis
TLDR: A hierarchical Bayesian model is considered for decomposing a matrix into low-rank and sparse components, assuming the observed matrix is a superposition of the two. The decomposition is written out after this entry.
  • Citations: 236
  • Highly influential citations: 33
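In equation form, with generic notation, the observation model is

    Y = L + S + E,

where L is low-rank, S is sparse, and E is dense noise. The Bayesian treatment places priors that shrink toward low rank for L and toward sparsity (for example, a beta-Bernoulli construction) for S, so the rank of L and the support of S are inferred from the data rather than set by hand.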
Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations
TLDR: Non-parametric Bayesian techniques are considered for learning dictionaries for sparse image representations, with applications in denoising, inpainting, and compressive sensing (CS).
  • Citations: 251
  • Highly influential citations: 32