Publications
Stochastic variational inference
TLDR
Stochastic variational inference makes it possible to apply complex Bayesian models to massive data sets, and experiments show that the Bayesian nonparametric topic model outperforms its parametric counterpart.
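As a quick reference, the generic stochastic variational inference update can be sketched in one line of math: sample a data point, form an intermediate global parameter as if that point were replicated N times, and take a small step toward it. The notation below is the standard exponential-family form with a common step-size schedule, not a transcription of the paper's equations.

```latex
\hat{\lambda}_t = \alpha + N\,\mathbb{E}_{q(z_n)}\!\left[\, t(x_n, z_n) \,\right],
\qquad
\lambda_t = (1-\rho_t)\,\lambda_{t-1} + \rho_t\,\hat{\lambda}_t,
\qquad
\rho_t = (t_0 + t)^{-\kappa}.
```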
Removing Rain from Single Images via a Deep Detail Network
TLDR
A deep detail network is proposed that directly reduces the mapping range from input to output, which makes the learning process easier; it significantly outperforms state-of-the-art methods on both synthetic and real-world images in both qualitative and quantitative terms.
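A rough sketch of the "detail" idea, assuming a simple box blur in place of the paper's guided filter and an illustrative layer count rather than the paper's network: the model sees only the high-frequency detail layer and predicts a correction that is added back to the rainy input.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DetailNetSketch(nn.Module):
    def __init__(self, channels=3, width=16, depth=4):
        super().__init__()
        layers = [nn.Conv2d(channels, width, 3, padding=1), nn.ReLU()]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(width, width, 3, padding=1), nn.ReLU()]
        layers += [nn.Conv2d(width, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, rainy):
        # Base layer via a box blur (a stand-in for the paper's guided filter).
        base = F.avg_pool2d(rainy, 15, stride=1, padding=7)
        detail = rainy - base                    # high-frequency detail layer
        # Predict a residual from the detail layer and add it to the rainy input.
        return rainy + self.body(detail)

x = torch.rand(1, 3, 64, 64)                     # synthetic "rainy" image
print(DetailNetSketch()(x).shape)                # torch.Size([1, 3, 64, 64])
```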
Clearing the Skies: A Deep Network Architecture for Single-Image Rain Removal
We introduce a deep network architecture called DerainNet for removing rain streaks from an image. Based on the deep convolutional neural network (CNN), we directly learn the mapping relationship between rainy and clean image detail layers from data.
Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images
TLDR
Nonparametric Bayesian methods are considered for recovering imagery from compressive, incomplete, and/or noisy measurements; learned dictionaries yield significant improvements in image recovery relative to standard orthonormal image expansions.
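For a concrete feel of the recovery step, here is a toy sketch rather than the paper's sampler: with a dictionary already in hand and half the pixels of a patch missing, the coefficients are fit on the observed entries only and the reconstruction fills in the rest. A ridge-regularized least-squares fit stands in for the full Bayesian inference over sparse coefficients, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
p, K = 64, 32                        # patch length (8x8) and dictionary size
D = rng.standard_normal((p, K))      # stand-in for a learned dictionary
w_true = np.zeros(K)
w_true[rng.choice(K, 3, replace=False)] = rng.standard_normal(3)
patch = D @ w_true + 0.01 * rng.standard_normal(p)

mask = rng.random(p) < 0.5           # only half the pixels are observed
D_obs, y_obs = D[mask], patch[mask]
w_hat = np.linalg.solve(D_obs.T @ D_obs + 0.1 * np.eye(K), D_obs.T @ y_obs)

recovered = D @ w_hat                # fills in the missing pixels
print(np.mean((recovered[~mask] - patch[~mask]) ** 2))
```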
Online Variational Inference for the Hierarchical Dirichlet Process
TLDR
This work proposes an online variational inference algorithm for the HDP that is easily applied to massive and streaming data, making it possible to analyze much larger data sets.
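As a small illustration of the machinery involved, online variational treatments of the HDP typically work with a truncated stick-breaking representation of the top-level Dirichlet process; the sketch below draws one such set of topic weights, with the truncation level and concentration chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)
T, gamma = 20, 1.0
v = rng.beta(1.0, gamma, size=T)                 # stick-breaking fractions
v[-1] = 1.0                                      # close the stick at the truncation level
remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
beta = v * remaining                             # top-level topic weights
print(beta.sum())                                # 1.0 up to floating point
```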
Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations
TLDR
The beta process is employed as a prior for learning the dictionary, and this non-parametric Bayesian method naturally infers an appropriate dictionary size, thereby allowing scaling to large images.
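To see why the beta process prior can infer a dictionary size, here is a toy draw from its finite beta-Bernoulli approximation (hyperparameters and truncation level are illustrative): even with a generous truncation, only a handful of atoms end up being used by any patch.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, a, b = 500, 256, 2.0, 2.0
pi = rng.beta(a / K, b * (K - 1) / K, size=K)    # per-atom usage probabilities
Z = rng.random((N, K)) < pi                      # which atoms each patch uses
print("atoms used by at least one patch:", int(Z.any(axis=0).sum()), "of", K)
```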
Nonparametric factor analysis with beta process priors
TLDR
This beta process factor analysis (BP-FA) model allows for a dataset to be decomposed into a linear combination of a sparse set of factors, providing information on the underlying structure of the observations.
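For reference, the BP-FA decomposition can be written compactly in its finite (truncated) form; the notation below is the usual beta-process factor-analysis setup rather than a quote from the paper.

```latex
x_i = \Phi\,(z_i \circ w_i) + \varepsilon_i, \qquad
z_{ik} \sim \mathrm{Bernoulli}(\pi_k), \qquad
\pi_k \sim \mathrm{Beta}\!\left(\tfrac{a}{K},\, \tfrac{b(K-1)}{K}\right), \qquad
w_i \sim \mathcal{N}(0, \sigma_w^2 I), \qquad
\varepsilon_i \sim \mathcal{N}(0, \sigma_n^2 I).
```

Each observation x_i is thus a combination of a sparse subset of the columns of the factor loading matrix, with the binary vector z_i selecting which factors are active.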
PanNet: A Deep Network Architecture for Pan-Sharpening
TLDR
This work incorporates domain-specific knowledge into the design of the PanNet architecture by focusing on the two aims of pan-sharpening, spectral and spatial preservation, and shows that the trained network generalizes well to images from different satellites without retraining.
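The two aims map onto two concrete design choices, sketched below under simplifying assumptions (a blur-based high-pass filter and illustrative layer sizes, not the paper's exact network): the network is fed only high-pass details so it learns spatial structure, and the up-sampled multispectral image is added back at the output so spectral content is carried through unchanged.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def highpass(x, k=9):
    # Remove the low-frequency content with a simple blur.
    return x - F.avg_pool2d(x, k, stride=1, padding=k // 2)

class PanSharpenSketch(nn.Module):
    def __init__(self, ms_bands=4, width=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ms_bands + 1, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, ms_bands, 3, padding=1),
        )

    def forward(self, pan, ms):
        ms_up = F.interpolate(ms, size=pan.shape[-2:], mode="bilinear",
                              align_corners=False)
        details = torch.cat([highpass(pan), highpass(ms_up)], dim=1)
        # Spectral content comes from ms_up; the network only adds spatial detail.
        return ms_up + self.body(details)

pan = torch.rand(1, 1, 64, 64)       # high-resolution panchromatic band
ms = torch.rand(1, 4, 16, 16)        # low-resolution multispectral image
print(PanSharpenSketch()(pan, ms).shape)   # torch.Size([1, 4, 64, 64])
```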
Variational Bayesian Inference with Stochastic Search
TLDR
This work presents an alternative algorithm based on stochastic optimization that allows for direct optimization of the variational lower bound and demonstrates the approach on two non-conjugate models: logistic regression and an approximation to the HDP.
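The key device is a score-function (log-derivative) gradient estimator, which makes the lower bound amenable to stochastic optimization without conjugacy. The toy check below uses q(z | lam) = N(lam, 1) and f(z) = z**2, chosen because the exact gradient of E_q[f(z)] with respect to lam is 2*lam and the Monte Carlo estimate can be compared against it; the paper additionally uses control variates to reduce variance, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n_samples = 1.5, 200_000
z = rng.normal(lam, 1.0, size=n_samples)
score = z - lam                       # d/d lam of log N(z; lam, 1)
grad_est = np.mean(z**2 * score)      # E_q[f(z) * score] estimates the gradient
print(grad_est, 2 * lam)              # Monte Carlo estimate vs exact value 3.0
```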
TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency
In this paper, we propose TopicRNN, a recurrent neural network (RNN)-based language model designed to directly capture the global semantic meaning relating words in a document via latent topics.
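A minimal sketch of the idea, with illustrative sizes and with a random topic vector and stop-word mask standing in for the inference network the paper actually uses: the next-word logits are the RNN contribution plus a document-topic contribution that is gated off at stop-word positions.

```python
import torch
import torch.nn as nn

vocab, emb, hid, n_topics = 1000, 32, 64, 10
embed = nn.Embedding(vocab, emb)
rnn = nn.GRU(emb, hid, batch_first=True)
W = nn.Linear(hid, vocab)                                # RNN state -> word logits
B = nn.Parameter(torch.randn(n_topics, vocab) * 0.01)    # topics -> word logits

tokens = torch.randint(0, vocab, (1, 12))                # one document of 12 tokens
stop = torch.randint(0, 2, (1, 12, 1)).float()           # 1 where the token is a stop word
theta = torch.softmax(torch.randn(n_topics), dim=0)      # stand-in document topic vector

h, _ = rnn(embed(tokens))
logits = W(h) + (1.0 - stop) * (theta @ B)               # topic term gated off for stop words
print(logits.shape)                                      # torch.Size([1, 12, 1000])
```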