Pyro: Deep Universal Probabilistic Programming
- Eli Bingham, Jonathan P. Chen, +7 authors Noah D. Goodman
- Computer Science, Mathematics · J. Mach. Learn. Res.
- 18 October 2018
Pyro uses stochastic variational inference algorithms and probability distributions built on top of PyTorch, a modern GPU-accelerated deep learning framework, to accommodate complex or model-specific algorithmic behavior.
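As a rough illustration of the stochastic variational inference that Pyro automates, the toy sketch below runs a reparameterised gradient update by hand for a Gaussian mean (pure Python; this is not Pyro's API, and `grad_log_joint` is an illustrative name):

```python
import random

random.seed(0)

# Infer q(mu) = N(m, s^2) for the mean of a Gaussian with N(0, 1) prior
# and unit observation noise, by stochastic gradient ascent on the ELBO.
data = [2.1, 1.9, 2.3, 2.0, 2.2]   # observations y_i ~ N(mu, 1)
m, s, lr = 0.0, 0.5, 0.001

def grad_log_joint(mu):
    """d/dmu [ log N(mu; 0, 1) + sum_i log N(y_i; mu, 1) ]."""
    return -mu + sum(y - mu for y in data)

for _ in range(2000):
    eps = random.gauss(0.0, 1.0)
    mu = m + s * eps               # reparameterisation trick: mu = m + s*eps
    # Pathwise ELBO gradient w.r.t. m; q's entropy does not depend on m,
    # so only the log-joint term contributes.
    m += lr * grad_log_joint(mu)

exact = sum(data) / (len(data) + 1)  # exact conjugate posterior mean
print(round(m, 2), round(exact, 2))
```

In Pyro itself, the model and guide are ordinary PyTorch code and this update loop is handled by the `SVI` machinery rather than written by hand.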
Conditional Similarity Networks
- Andreas Veit, Serge J. Belongie, Theofanis Karaletsos
- Computer Science · IEEE Conference on Computer Vision and Pattern…
- 25 March 2016
This work proposes Conditional Similarity Networks (CSNs) that learn embeddings differentiated into semantically distinct subspaces, each capturing a different notion of similarity.
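The core idea of condition-specific subspaces can be sketched with fixed masks over a shared embedding (a minimal illustration only; in the paper both the embeddings and the masks are learned, and the item names below are hypothetical):

```python
import math

# One shared embedding per item; per-condition masks select the subspace
# relevant to a given notion of similarity.
EMBED = {                      # hypothetical 4-d item embeddings
    "red_boot":   [0.9, 0.1, 0.8, 0.2],
    "red_sandal": [0.9, 0.2, 0.1, 0.9],
    "blue_boot":  [0.1, 0.9, 0.8, 0.1],
}
MASKS = {                      # learned in the paper; hard-coded here
    "color":    [1, 1, 0, 0],  # dims 0-1 encode colour
    "category": [0, 0, 1, 1],  # dims 2-3 encode shoe type
}

def masked_dist(a, b, condition):
    """Euclidean distance restricted to the condition's subspace."""
    m = MASKS[condition]
    return math.sqrt(sum(mi * (x - y) ** 2
                         for mi, x, y in zip(m, EMBED[a], EMBED[b])))

# Under "color" the two red shoes are close; under "category" the two boots are.
print(masked_dist("red_boot", "red_sandal", "color")
      < masked_dist("red_boot", "blue_boot", "color"))
print(masked_dist("red_boot", "blue_boot", "category")
      < masked_dist("red_boot", "red_sandal", "category"))
```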
Bayesian representation learning with oracle constraints
- Theofanis Karaletsos, Serge J. Belongie, Gunnar Rätsch
- Mathematics, Computer Science · ICLR
- 16 June 2015
It is shown how implicit triplet information can provide a rich signal for learning representations that outperform previous metric learning approaches, as well as generative models trained without this side information, on a variety of predictive tasks.
RiboDiff: detecting changes of mRNA translation efficiency from ribosome footprints
- Y. Zhong, Theofanis Karaletsos, +5 authors G. Rätsch
- Computer Science, Medicine · Bioinform.
- 14 September 2016
A statistical framework and analysis tool, RiboDiff, is presented to detect genes with changes in translation efficiency across experimental treatments; it performs a statistical test for differential translation efficiency using both mRNA abundance and ribosome occupancy.
Adversarial Message Passing For Graphical Models
- Theofanis Karaletsos
- Mathematics, Computer Science · ArXiv
- 15 December 2016
This work treats GANs as a basis for likelihood-free inference in generative models and generalizes them to Bayesian posterior inference over factor graphs, finding that Bayesian inference on structured models can be performed with only sampling and discrimination when using nonparametric variational families, without access to explicit distributions.
Likelihood-free inference with emulator networks
- Jan-Matthis Lueckmann, G. Bassetto, Theofanis Karaletsos, J. Macke
- Computer Science, Mathematics · AABI
- 23 May 2018
This work presents a new ABC method that uses probabilistic neural emulator networks to learn synthetic likelihoods on simulated data: both local emulators, which approximate the likelihood for specific observed data, and global ones, which are applicable to a range of data.
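A stripped-down sketch of the synthetic-likelihood idea behind such methods (a Gaussian fit to simulator output stands in for the learned emulator network; the simulator and grid search below are illustrative assumptions, not the paper's method):

```python
import math, random

random.seed(1)

def simulator(theta, n=200):
    """Black-box simulator: we can sample from it, but its likelihood is unknown."""
    return [theta + random.gauss(0.0, 1.0) for _ in range(n)]

def synthetic_loglik(theta, x_obs):
    """Gaussian stand-in for an emulator of the likelihood at parameter theta."""
    sims = simulator(theta)
    mu = sum(sims) / len(sims)
    var = sum((s - mu) ** 2 for s in sims) / (len(sims) - 1)
    return -0.5 * math.log(2 * math.pi * var) - (x_obs - mu) ** 2 / (2 * var)

x_obs = 3.0
grid = [i / 10 for i in range(0, 61)]        # candidate parameters 0.0 .. 6.0
best = max(grid, key=lambda t: synthetic_loglik(t, x_obs))
print(best)   # should land near the parameter that generated x_obs
```

An emulator network replaces the per-theta Gaussian fit with a model that amortizes the likelihood approximation across parameter values, so far fewer simulator calls are needed.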
Generalized Hidden Parameter MDPs: Transferable Model-based RL in a Handful of Trials
The GHP-MDP augments model-based RL with latent variables that capture these hidden parameters, facilitating transfer across tasks; a variant of the model incorporates explicit latent structure mirroring the causal factors of variation across tasks (for instance, agent properties, environmental factors, and goals).
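The transfer mechanism can be illustrated in miniature: a shared dynamics model `f(s, a, z)` with a per-task hidden parameter `z`, inferred from a handful of transitions in a new task (a toy linear sketch under assumed dynamics, not the paper's neural model or its variational inference):

```python
import random

random.seed(0)

def step(s, a, z):
    """Hypothetical per-task dynamics; z is the hidden parameter (e.g. a mass)."""
    return s + z * a + random.gauss(0.0, 0.01)

# A handful of trials in a new task whose z is unknown:
z_true = 0.7
trials = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(5)]
data = [(s, a, step(s, a, z_true)) for s, a in trials]

# Infer the latent z by least squares on the observed transitions, then the
# shared model f(s, a, z_hat) transfers to the new task without retraining f.
num = sum(a * (s2 - s) for s, a, s2 in data)
den = sum(a * a for _, a, _ in data)
z_hat = num / den
print(round(z_hat, 2))
```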
An Empirical Analysis of Topic Modeling for Mining Cancer Clinical Notes
- Katherine Redfield Chan, Xinghua Lou, +4 authors G. Rätsch
- Computer Science · IEEE 13th International Conference on Data Mining…
- 7 December 2013
Using a variety of techniques including Topic Modeling, Principal Component Analysis and Bi-clustering, we explore electronic patient records in the form of unstructured clinical notes and genetic…
Disentangling Nonlinear Perceptual Embeddings With Multi-Query Triplet Networks
This paper proposes Multi-Query Networks (MQNs) that leverage recent advances in representation learning on factorized triplet embeddings, in combination with Convolutional Networks, to learn embeddings differentiated into semantically distinct subspaces via a latent space attention mechanism.
Probabilistic Meta-Representations Of Neural Networks
This work considers a richer prior distribution in which units in the network are represented by latent variables, and the weights between units are drawn conditionally on the values of those variables.
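The shape of such a hierarchical prior can be sketched directly (an illustrative construction only: the latent codes, the dot-product mean, and the dimensions below are assumptions, not the paper's specification):

```python
import random

random.seed(0)

# Each unit carries a latent code; every weight is drawn conditionally on the
# codes of the two units it connects (here, mean = dot product of the codes).
D = 3                                         # assumed latent dimension per unit
in_units  = [[random.gauss(0, 1) for _ in range(D)] for _ in range(4)]
out_units = [[random.gauss(0, 1) for _ in range(D)] for _ in range(2)]

def sample_weight(u, v, sigma=0.1):
    mean = sum(ui * vi for ui, vi in zip(u, v))
    return random.gauss(mean, sigma)

# A 2x4 weight matrix drawn from the hierarchical prior: correlations between
# weights arise through the shared per-unit latent variables.
W = [[sample_weight(u, v) for u in in_units] for v in out_units]
print(len(W), len(W[0]))
```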