Publications
Bayesian Compressive Sensing
TLDR
The underlying theory, an associated algorithm, example results, and comparisons to other compressive-sensing inversion algorithms in the literature are presented.
Sparse multinomial logistic regression: fast algorithms and generalization bounds
TLDR
This paper introduces a true multiclass formulation based on multinomial logistic regression and derives fast exact algorithms for learning sparse multiclass classifiers that scale favorably in both the number of training samples and the feature dimensionality, making them applicable even to large data sets in high-dimensional feature spaces.
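As a rough illustration of the formulation this TLDR describes (not the paper's exact fast algorithms), the sketch below fits an L1-penalized multinomial logistic regression on hypothetical synthetic data, using scikit-learn as a stand-in solver; the L1 penalty drives most weights to exactly zero, yielding a sparse multiclass classifier in the spirit of SMLR.

```python
# A minimal sketch, assuming synthetic data; scikit-learn's saga solver
# stands in for the paper's own exact sparse-learning algorithms.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical high-dimensional multiclass data.
X, y = make_classification(n_samples=500, n_features=200,
                           n_informative=20, n_classes=4, random_state=0)

# The l1 penalty zeroes out most weights, giving a sparse multiclass model.
clf = LogisticRegression(penalty="l1", solver="saga", C=0.5,
                         max_iter=5000).fit(X, y)
print("nonzero weights:", np.count_nonzero(clf.coef_), "of", clf.coef_.size)
```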
Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images
TLDR
Nonparametric Bayesian methods are considered for recovery of imagery based upon compressive, incomplete, and/or noisy measurements; significant improvements in image recovery are manifested using learned dictionaries, relative to standard orthonormal image expansions.
Multitask Compressive Sensing
TLDR
It has been demonstrated that, with appropriate design of the compressive measurements used to define v, the decompressive mapping v → u may be performed with an error whose asymptotic properties are analogous to those of the best adaptive transform-coding algorithm applied in the basis Ψ.
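As a rough illustration of this measurement model (not the paper's multitask Bayesian inversion), the sketch below generates a signal u that is sparse in an orthonormal basis Ψ, forms compressive measurements v = Φu, and performs the decompressive mapping v → u using orthogonal matching pursuit as a stand-in solver; all dimensions are illustrative.

```python
# A minimal sketch of the compressive measurement model v = Phi @ u,
# with u sparse in the basis Psi. OMP replaces the paper's Bayesian solver.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 128, 48, 5                                 # signal dim, measurements, sparsity
Psi = np.linalg.qr(rng.standard_normal((n, n)))[0]   # orthonormal basis
theta = np.zeros(n)
theta[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
u = Psi @ theta                                      # signal, k-sparse in Psi
Phi = rng.standard_normal((m, n)) / np.sqrt(m)       # random projection matrix
v = Phi @ u                                          # compressive measurements

# Decompressive mapping v -> u: recover theta, then synthesize u.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi @ Psi, v)
u_hat = Psi @ omp.coef_
print("relative error:", np.linalg.norm(u - u_hat) / np.linalg.norm(u))
```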
Exploiting Structure in Wavelet-Based Bayesian Compressive Sensing
  • L. He, L. Carin
  • Computer Science
    IEEE Transactions on Signal Processing
  • 1 September 2009
TLDR
A hierarchical Bayesian model is developed, with efficient inference via Markov chain Monte Carlo (MCMC) sampling and performance comparisons to many state-of-the-art compressive-sensing inversion algorithms.
Multi-Task Learning for Classification with Dirichlet Process Priors
TLDR
Experimental results on two real-life MTL problems indicate that the proposed algorithms automatically identify subgroups of related tasks whose training data appear to be drawn from similar distributions, and that they are more accurate than simpler approaches such as single-task learning, pooling of data across all tasks, and simplified approximations to the DP.
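The subgroup-identification behavior comes from the Dirichlet process prior; the sketch below shows that mechanism in its simplest form, a Chinese restaurant process over tasks with an illustrative concentration parameter alpha, not the paper's actual inference procedure.

```python
# A minimal sketch of DP task clustering via the Chinese restaurant process:
# each new task joins an existing subgroup in proportion to its size, or
# starts a new one with probability proportional to alpha (illustrative).
import numpy as np

rng = np.random.default_rng(0)
alpha, n_tasks = 1.0, 20
assignments = [0]                                    # first task starts group 0
for t in range(1, n_tasks):
    counts = np.bincount(assignments)                # sizes of existing groups
    probs = np.append(counts, alpha) / (t + alpha)   # existing groups vs. new group
    assignments.append(rng.choice(len(probs), p=probs))
print("inferred task subgroups (sizes):", np.bincount(assignments))
```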
Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks
TLDR
This work proposes combining adaptive preconditioners with Stochastic Gradient Langevin Dynamics, establishes theoretical properties on asymptotic convergence and predictive risk, and presents empirical results for logistic regression, feedforward neural nets, and convolutional neural nets demonstrating that the preconditioned SGLD method gives state-of-the-art performance.
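A minimal numpy sketch of the pSGLD update this TLDR describes: an RMSprop-style diagonal preconditioner combined with Langevin noise. The curvature-correction term Γ(θ) is dropped here, as is common in practice, and the function and variable names are illustrative.

```python
# One pSGLD step: adapt a diagonal preconditioner from squared gradients,
# then take a preconditioned gradient step with matching Gaussian noise.
import numpy as np

def psgld_step(theta, grad_logpost, V, rng,
               eps=1e-3, alpha=0.99, lam=1e-5):
    """grad_logpost(theta) should return a stochastic (minibatch) estimate
    of the gradient of the log-posterior; V is the running second moment."""
    g = grad_logpost(theta)
    V = alpha * V + (1 - alpha) * g * g           # RMSprop-style adaptation
    G = 1.0 / (lam + np.sqrt(V))                  # diagonal preconditioner
    noise = rng.standard_normal(theta.shape) * np.sqrt(eps * G)
    theta = theta + 0.5 * eps * G * g + noise     # ascend log-posterior + noise
    return theta, V
```

With a minibatch gradient, the likelihood term inside grad_logpost should be rescaled by N divided by the batch size so that it estimates the full-data gradient.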
Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations
TLDR
The beta process is employed as a prior for learning the dictionary, and this non-parametric Bayesian method naturally infers an appropriate dictionary size, thereby allowing scaling to large images.
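A minimal sketch of the beta-Bernoulli mechanism behind this: under a truncated beta process prior, each of K candidate atoms receives a usage probability, and only a subset is ever activated, so the effective dictionary size is inferred rather than fixed. The hyperparameters a, b and the truncation level K below are illustrative.

```python
# Finite approximation to the beta process: pi_k ~ Beta(a/K, b(K-1)/K),
# with binary indicators marking which atoms each patch actually uses.
import numpy as np

rng = np.random.default_rng(0)
K, N, a, b = 256, 1000, 1.0, 1.0               # truncation, patches, hypers
pi = rng.beta(a / K, b * (K - 1) / K, size=K)  # per-atom usage probabilities
Z = rng.random((N, K)) < pi                    # which atoms each patch uses
active = Z.any(axis=0).sum()
print(f"{active} of {K} candidate atoms are ever used")
```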
Semantic Compositional Networks for Visual Captioning
  • Zhe Gan, Chuang Gan, L. Deng
  • Computer Science
    IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • 23 November 2016
TLDR
Experimental results show that the proposed method significantly outperforms prior state-of-the-art approaches, across multiple evaluation metrics.
Bayesian Robust Principal Component Analysis
TLDR
The Bayesian framework infers an approximate representation for the noise statistics while simultaneously inferring the low-rank and sparse-outlier contributions; the model is robust to a broad range of noise levels, without having to change model hyperparameter settings.
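A minimal sketch of the observation model this TLDR refers to: observed data Y decomposes into a low-rank part L, sparse outliers S, and dense noise E. The Bayesian machinery that jointly infers these components and the noise statistics is beyond this sketch; all dimensions, sparsity levels, and noise scales are illustrative.

```python
# Generative model for robust PCA: Y = L (low rank) + S (sparse outliers)
# + E (dense noise). Recovering L and S from Y is the inference problem.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 100, 80, 5
L = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r part
S = np.zeros((m, n))
mask = rng.random((m, n)) < 0.05               # ~5% entries hit by outliers
S[mask] = 10 * rng.standard_normal(mask.sum())
E = 0.1 * rng.standard_normal((m, n))          # dense Gaussian noise
Y = L + S + E                                  # observed matrix
print("rank(L) =", np.linalg.matrix_rank(L), "| outliers:", mask.sum())
```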
...