Publications
A New Approach to Probabilistic Programming Inference
TLDR
This paper, which originally appeared in the Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS) 2014, has been updated to reflect changes in the Anglican language.
Canonical Correlation Forests
TLDR
We introduce canonical correlation forests (CCFs), a new tree ensemble method for classification where the individual canonical correlation trees (CCTs) use hyperplane splits based on the feature projections from a canonical correlation analysis.
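To make the CCF summary concrete, here is a minimal NumPy sketch of the ingredient behind canonical correlation trees: project the features onto the leading canonical direction between X and the one-hot labels, then threshold that projection to get a hyperplane split. The regularisation term and the median threshold rule are simplifications of ours, not the paper's exact procedure.

```python
import numpy as np

def cca_direction(X, Y, reg=1e-6):
    """Leading canonical direction between feature matrix X and label matrix Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = len(X)
    Sxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / n
    # Leading eigenvector of Sxx^{-1} Sxy Syy^{-1} Syx solves the CCA problem.
    M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Sxy.T)
    vals, vecs = np.linalg.eig(M)
    return np.real(vecs[:, np.argmax(np.real(vals))])

def hyperplane_split(X, y):
    """Return a CCA-based split direction and a threshold on the projection."""
    Y = np.eye(int(y.max()) + 1)[y]     # one-hot encode the class labels
    w = cca_direction(X, Y)
    proj = X @ w
    return w, np.median(proj)           # split at the median projected value
```

A full CCT would recurse on the two sides of the split and choose thresholds by an impurity criterion; this sketch only shows how the hyperplane itself is obtained.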
Fatty acid synthesis: a potential selective target for antineoplastic therapy.
OA-519 is a prognostic molecule found in tumor cells from breast cancer patients with markedly worsened prognosis. We purified OA-519 from human breast carcinoma cells, obtained its peptide sequence, …
Diagnosis code assignment: models and evaluation metrics
TLDR
We propose novel evaluation metrics, which reflect the distances between gold-standard and predicted codes and their locations in the ICD9 tree.
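The idea of a distance between codes in the ICD9 tree can be illustrated with a toy path-length metric: score two codes by the length of the path connecting them through their lowest common ancestor. The hierarchy fragment and the exact metric below are illustrative, not the paper's definitions.

```python
# Hypothetical fragment of an ICD-9-style hierarchy, child -> parent.
PARENT = {
    "428.0": "428", "428.1": "428",       # heart failure subtypes
    "428": "420-429",                     # other forms of heart disease
    "410.0": "410", "410": "410-414",     # ischemic heart disease
    "420-429": "circulatory", "410-414": "circulatory",
    "circulatory": None,
}

def ancestors(code):
    """Chain from a code up to the root, inclusive."""
    chain = [code]
    while PARENT.get(code) is not None:
        code = PARENT[code]
        chain.append(code)
    return chain

def tree_distance(a, b):
    """Path length between two codes via their lowest common ancestor."""
    anc_a = ancestors(a)
    anc_b = ancestors(b)
    anc_b_set = set(anc_b)
    for i, node in enumerate(anc_a):     # walk up from `a` until we hit
        if node in anc_b_set:            # an ancestor of `b`
            return i + anc_b.index(node)
    return None
```

Under such a metric, predicting a sibling code (e.g. a different heart-failure subtype) is penalised less than predicting a code from a distant branch, which is the intuition behind distance-aware evaluation.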
Online Learning Rate Adaptation with Hypergradient Descent
TLDR
We introduce a general method for improving the convergence rate of gradient-based optimizers that is easy to implement and works well in practice.
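The method summarised above applies gradient descent to the learning rate itself, using the dot product of consecutive gradients as the "hypergradient". A minimal 1-D sketch, assuming a simple quadratic objective; the hyper-learning-rate `beta` is a value of our choosing, not one from the paper.

```python
def grad(x):
    """Gradient of the toy objective f(x) = (x - 3)^2."""
    return 2.0 * (x - 3.0)

def hypergradient_descent(x0, alpha0=0.01, beta=0.001, steps=200):
    """Gradient descent where the learning rate alpha is itself adapted."""
    x, alpha = x0, alpha0
    g_prev = 0.0
    for _ in range(steps):
        g = grad(x)
        # Hypergradient update: grow alpha while successive gradients agree
        # in sign, shrink it when they oppose (i.e. after an overshoot).
        alpha = alpha + beta * g * g_prev
        x = x - alpha * g
        g_prev = g
    return x, alpha
```

Because the update needs only the previous gradient, it adds negligible memory and compute on top of the underlying optimizer, which is why the summary calls it easy to implement.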
Learning Disentangled Representations with Semi-Supervised Deep Generative Models
TLDR
We propose to learn disentangled representations using model architectures that generalise from standard VAEs, employing a general graphical model structure in the encoder and decoder.
Hierarchically Supervised Latent Dirichlet Allocation
TLDR
We introduce hierarchically supervised latent Dirichlet allocation (HSLDA), a model for hierarchically and multiply labeled bag-of-words data that has been, at least in part, manually categorized.
Semantics for probabilistic programming: higher-order functions, continuous distributions, and soft constraints
TLDR
We study the semantic foundation of expressive probabilistic programming languages that support higher-order functions, continuous distributions, and soft constraints (such as Anglican, Church, and Venture).
Inference Compilation and Universal Probabilistic Programming
TLDR
We introduce a method for using deep neural networks to amortize the cost of inference in models from the family induced by universal probabilistic programming languages, establishing a framework that combines the strengths of probabilistic programming and deep learning methods.
A stochastic memoizer for sequence data
TLDR
We propose an unbounded-depth, hierarchical, Bayesian nonparametric model for discrete sequence data that can be represented in time and space linear in the length of the training sequence.