Publications
Pyro: Deep Universal Probabilistic Programming
Pyro uses stochastic variational inference algorithms and probability distributions built on top of PyTorch, a modern GPU-accelerated deep learning framework, to accommodate complex or model-specific algorithmic behavior.
Conditional Similarity Networks
This work proposes Conditional Similarity Networks (CSNs) that learn embeddings differentiated into semantically distinct subspaces, each capturing a different notion of similarity.
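The subspace idea can be sketched with a masked triplet loss: a binary mask selects the dimensions belonging to one notion of similarity before distances are computed. The 4-d embeddings and the "color"/"shape" split below are hypothetical, chosen only to show how the mask changes which triplet is satisfied.

```python
import torch
import torch.nn.functional as F

def masked_triplet_loss(anchor, positive, negative, mask, margin=0.2):
    # project all three embeddings into the subspace selected by the mask
    a, p, n = anchor * mask, positive * mask, negative * mask
    d_ap = F.pairwise_distance(a, p)
    d_an = F.pairwise_distance(a, n)
    return F.relu(d_ap - d_an + margin).mean()

# toy 4-d embeddings: first two dims encode "color", last two "shape"
mask_color = torch.tensor([1.0, 1.0, 0.0, 0.0])
anchor   = torch.tensor([[1.0, 0.0,  5.0,  5.0]])
positive = torch.tensor([[1.0, 0.0, -5.0, -5.0]])  # same color, different shape
negative = torch.tensor([[0.0, 1.0,  5.0,  5.0]])  # different color, same shape

subspace_loss = masked_triplet_loss(anchor, positive, negative, mask_color)
full_loss = masked_triplet_loss(anchor, positive, negative, torch.ones(4))
```

In the color subspace the triplet is already satisfied (`subspace_loss` is zero), while in the unmasked space the shape dimensions dominate and the same triplet is violated (`full_loss` is positive).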
Bayesian representation learning with oracle constraints
It is shown that implicit triplet information provides a rich signal for learning representations that outperform previous metric learning approaches, as well as generative models trained without this side information, on a variety of predictive tasks.
RiboDiff: detecting changes of mRNA translation efficiency from ribosome footprints
A statistical framework and analysis tool, RiboDiff, is presented to detect genes with changes in translation efficiency across experimental treatments; it performs a statistical test for differential translation efficiency using both mRNA abundance and ribosome occupancy.
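The quantity being tested can be illustrated with a minimal sketch: translation efficiency (TE) as ribosome footprint counts normalized by mRNA counts, compared across conditions as a log fold change. This is a simplification for illustration; the counts and the pseudocount are invented, and RiboDiff's actual test models count dispersion rather than computing a plain ratio.

```python
import numpy as np

def translation_efficiency(ribo_counts, rna_counts, pseudo=0.5):
    # TE = ribosome occupancy normalized by mRNA abundance;
    # a small pseudocount guards against division by zero
    return (ribo_counts + pseudo) / (rna_counts + pseudo)

# toy counts for two genes under control and treatment
te_ctrl = translation_efficiency(np.array([120.0, 80.0]), np.array([300.0, 200.0]))
te_trt  = translation_efficiency(np.array([240.0, 75.0]), np.array([310.0, 210.0]))

# gene 0: footprints double at similar mRNA levels -> TE up;
# gene 1: both stay roughly flat -> TE unchanged
log2fc = np.log2(te_trt / te_ctrl)
```

Separating the two data types this way is what distinguishes a genuine change in translation efficiency from a change driven purely by transcription.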
Adversarial Message Passing For Graphical Models
This work treats GANs as a basis for likelihood-free inference in generative models and generalizes them to Bayesian posterior inference over factor graphs, finding that, when using nonparametric variational families, Bayesian inference on structured models can be performed with sampling and discrimination alone, without access to explicit distributions.
Likelihood-free inference with emulator networks
This work presents a new ABC method which uses probabilistic neural emulator networks to learn synthetic likelihoods on simulated data -- both local emulators which approximate the likelihood for specific observed data, as well as global ones which are applicable to a range of data.
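The synthetic-likelihood idea underlying this approach can be sketched in a few lines: simulate data at a parameter value, fit a Gaussian to the simulated summaries, and score the observed data under it. Note this sketch runs the simulator directly at each parameter; the paper's contribution is to replace that step with a learned neural emulator. The toy simulator below is an assumption for illustration.

```python
import numpy as np

def synthetic_loglik(theta, x_obs, simulator, n_sim=200, rng=None):
    # Gaussian synthetic likelihood: simulate at theta, fit mean/variance
    # to the simulated summaries, and evaluate x_obs under that Gaussian
    rng = rng or np.random.default_rng(0)
    sims = np.array([simulator(theta, rng) for _ in range(n_sim)])
    mu, var = sims.mean(axis=0), sims.var(axis=0) + 1e-6
    return -0.5 * np.sum((x_obs - mu) ** 2 / var + np.log(2 * np.pi * var))

def simulator(theta, rng):
    # hypothetical black-box simulator: noisy observations around theta
    return rng.normal(theta, 1.0, size=10)

rng = np.random.default_rng(1)
x_obs = simulator(2.0, rng)          # "observed" data generated at theta = 2
ll_true = synthetic_loglik(2.0, x_obs, simulator)
ll_wrong = synthetic_loglik(5.0, x_obs, simulator)
```

The true parameter scores a higher synthetic log-likelihood than a distant one, which is what lets such likelihoods drive ABC-style posterior inference.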
Generalized Hidden Parameter MDPs: Transferable Model-based RL in a Handful of Trials
The GHP-MDP augments model-based RL with latent variables that capture these hidden parameters, facilitating transfer across tasks; a variant of the model is explored that incorporates explicit latent structure mirroring the causal factors of variation across tasks (for instance: agent properties, environmental factors, and goals).
An Empirical Analysis of Topic Modeling for Mining Cancer Clinical Notes
Using a variety of techniques, including Topic Modeling, Principal Component Analysis, and Bi-clustering, we explore electronic patient records in the form of unstructured clinical notes and genetic data.
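A minimal topic-modeling pass over clinical-note-like text might look like the sketch below, using scikit-learn's LDA. The four example notes are invented stand-ins; real clinical notes would require de-identification and far more preprocessing.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# hypothetical stand-ins for unstructured clinical notes
notes = [
    "patient presents with tumor metastasis chemotherapy cycle",
    "chemotherapy cycle tumor response stable disease",
    "mutation sequencing variant pathogenic gene panel",
    "gene variant sequencing panel mutation detected",
]

# bag-of-words counts, then a 2-topic LDA fit
X = CountVectorizer().fit_transform(notes)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# each row is a document's distribution over topics (rows sum to 1)
doc_topics = lda.transform(X)
```

The per-document topic mixtures can then feed downstream steps such as PCA or bi-clustering to group patients by dominant themes in their notes.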
Disentangling Nonlinear Perceptual Embeddings With Multi-Query Triplet Networks
This paper proposes Multi-Query Networks (MQNs) that leverage recent advances in representation learning on factorized triplet embeddings, in combination with Convolutional Networks, to learn embeddings differentiated into semantically distinct subspaces via a latent-space attention mechanism.
Probabilistic Meta-Representations Of Neural Networks
This work considers a richer prior distribution in which units in the network are represented by latent variables, and the weights between units are drawn conditionally on the values of the collection of those variables.
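The conditional structure described here can be sketched as follows: each unit carries a latent vector, and each weight is drawn from a Gaussian whose mean depends on the latents of the two units it connects. The dot-product mean function and the dimensions are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3                                  # latent dimension per unit
z_in = rng.standard_normal((4, d))     # latents for 4 input units
z_out = rng.standard_normal((2, d))    # latents for 2 output units

def sample_weights(z_out, z_in, sigma=0.1, rng=rng):
    # each weight w_ij is drawn conditionally on the latents of the
    # units it connects: w_ij ~ N(z_out[i] . z_in[j], sigma^2)
    mean = z_out @ z_in.T              # (2, 4) matrix of conditional means
    return mean + sigma * rng.standard_normal(mean.shape)

W = sample_weights(z_out, z_in)

# averaging many draws recovers the latent-determined means
mean_est = np.mean([sample_weights(z_out, z_in) for _ in range(2000)], axis=0)
```

Because the prior over weights is mediated by a handful of per-unit latents rather than independent per-weight parameters, correlations between weights sharing a unit come for free.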