The use of Bayesian methods in large-scale data settings is attractive because of the rich hierarchical models, uncertainty quantification, and prior specification they provide. Standard Bayesian…

Markov jump processes (MJPs) are used to model a wide range of phenomena from disease progression to RNA path folding. However, maximum likelihood estimation of parametric models leads to degenerate…
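For readers unfamiliar with MJPs, the basic generative mechanism (exponential holding times, rate-proportional jumps) can be sketched with a tiny Gillespie-style simulator. This is a generic illustration of the model class, not code from the paper; the rate matrix and function names are hypothetical.

```python
import random

# Hypothetical illustration (not from the abstract above): simulate a
# finite-state Markov jump process. rates[i][j] is the jump rate i -> j.
def simulate_mjp(rates, state, t_end, rng):
    """Return the (time, state) jump sequence of an MJP up to time t_end."""
    t, path = 0.0, [(0.0, state)]
    while True:
        total = sum(r for j, r in enumerate(rates[state]) if j != state)
        if total == 0.0:
            break  # absorbing state: no outgoing rate
        t += rng.expovariate(total)  # exponential holding time
        if t >= t_end:
            break
        # choose the next state with probability proportional to its rate
        u = rng.random() * total
        for j, r in enumerate(rates[state]):
            if j == state:
                continue
            u -= r
            if u <= 0.0:
                state = j
                break
        path.append((t, state))
    return path

rng = random.Random(0)
path = simulate_mjp([[0.0, 1.0], [2.0, 0.0]], 0, 10.0, rng)
```

The holding time in each state is exponential with rate equal to the total outgoing rate, which is exactly the structure the small-variance asymptotics in the SVA snippet below operate on.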

TREVOR CAMPBELL, JONATHAN H. HUGGINS, JONATHAN P. HOW, and TAMARA BRODERICK. First authorship is shared jointly by T. Campbell and J. H. Huggins. Computer Science and Artificial Intelligence…

Common statistical practice has shown that the full power of Bayesian methods is not realized until hierarchical priors are used, as these allow for greater “robustness” and the ability to “share…

Generalized linear models (GLMs)—such as logistic regression, Poisson regression, and robust regression—provide interpretable models for diverse data types. Probabilistic approaches, particularly…
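The GLMs named in this abstract share one structure: a linear predictor pushed through a family-specific inverse link. A minimal sketch of that shared structure (a generic textbook illustration; the function names and toy numbers are hypothetical, not from the paper):

```python
import math

# Hypothetical illustration (not from the abstract): the GLMs named above
# share the structure E[y | x] = g^{-1}(<x, beta>); only the inverse link
# g^{-1} differs by family.
INVERSE_LINKS = {
    "logistic": lambda eta: 1.0 / (1.0 + math.exp(-eta)),  # Bernoulli mean
    "poisson": lambda eta: math.exp(eta),                   # Poisson mean
}

def glm_mean(x, beta, family):
    """Mean response of a GLM: inverse link applied to the linear predictor."""
    eta = sum(xi * bi for xi, bi in zip(x, beta))
    return INVERSE_LINKS[family](eta)

print(glm_mean([1.0, 2.0], [0.5, -0.25], "logistic"))  # eta = 0 -> 0.5
```

The interpretability claim in the abstract rests on this structure: each coefficient in beta acts on the response only through the linear predictor eta.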

To obtain the SVA objective from the parametric MJP model, we begin by scaling the exponential distribution f(t;λ) = λ exp(−λt), which is an exponential family distribution with natural parameter η =…
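The truncated display can be compared with the standard exponential-family decomposition of the exponential density. The following is a textbook identity under one common convention, stated for reference only; the paper's own parameterization, cut off above, may differ:

```latex
f(t;\lambda) = \lambda \exp(-\lambda t)
             = \exp\bigl(\eta\,T(t) - A(\eta)\bigr),
\qquad T(t) = t,\quad \eta = -\lambda,\quad A(\eta) = -\log(-\eta),
```

with base measure 1. Substituting back gives \(\exp(-\lambda t + \log\lambda) = \lambda e^{-\lambda t}\), confirming the decomposition term by term.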

Proposition. The Bayesian cumulative loss is bounded as L_Bayes(Z_T) ≤ L_Q(Z_T) + KL(Q‖P_0). (A.1)

Proof of Theorem 2.4. Fix a choice of θ* and φ and write Q = Q_{θ*,φ}. Take a second-order Taylor…

This paper formalizes a latent variable inference problem we call supervised pattern discovery, the goal of which is to find sets of observations that belong to a single “pattern.” We discuss two…

Gaussian processes (GPs) offer a flexible class of priors for nonparametric Bayesian regression, but popular GP posterior inference methods are typically prohibitively slow or lack desirable…