Corpus ID: 199452940

Functional probabilistic programming for scalable Bayesian modelling

@article{Law2019FunctionalPP,
  title={Functional probabilistic programming for scalable Bayesian modelling},
  author={Jonathan Law and Darren J. Wilkinson},
  journal={arXiv: Computation},
  year={2019}
}
  • J. Law, D. Wilkinson
  • Published 6 August 2019
  • Computer Science, Mathematics
  • arXiv: Computation
Bayesian inference involves the specification of a statistical model by a statistician or practitioner, with careful thought about what each parameter represents. This results in particularly interpretable models which can be used to explain relationships present in the observed data. Bayesian models are useful when an experiment has only a small number of observations and in applications where transparency of data-driven decisions is important. Traditionally, parameter inference in Bayesian…
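
The functional approach the paper advocates treats distributions as first-class, composable values. As a flavour of that style, here is a minimal monadic sampler in Scala; the Dist type and its API are a hypothetical sketch for illustration, not the authors' library:

import scala.util.Random

// A probability distribution represented as a sampling function, with
// map and flatMap so models compose monadically. Hypothetical sketch,
// not the authors' actual library.
final case class Dist[A](sample: Random => A) {
  def map[B](f: A => B): Dist[B] = Dist(rng => f(sample(rng)))
  def flatMap[B](f: A => Dist[B]): Dist[B] =
    Dist(rng => f(sample(rng)).sample(rng))
}

object Dist {
  def normal(mu: Double, sigma: Double): Dist[Double] =
    Dist(rng => mu + sigma * rng.nextGaussian())
}

object LinearModelExample {
  // The generative model for one observation at covariate x, written as
  // a for-comprehension over the priors and the likelihood.
  def observation(x: Double): Dist[(Double, Double, Double)] =
    for {
      alpha <- Dist.normal(0.0, 5.0)              // intercept prior
      beta  <- Dist.normal(0.0, 5.0)              // slope prior
      y     <- Dist.normal(alpha + beta * x, 1.0) // likelihood
    } yield (alpha, beta, y)

  def main(args: Array[String]): Unit =
    println(observation(2.0).sample(new Random(42L)))
}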

Citations

Automatic Backward Filtering Forward Guiding for Markov processes and graphical models
This work backpropagates the information provided by observations through the model to transform the generative (forward) model into a pre-conditional model guided by the data, which approximates the actual conditional model with known likelihood-ratio between the two.

References

Showing 1-10 of 54 references
Stan: A Probabilistic Programming Language
Stan is a probabilistic programming language for specifying statistical models that provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling.
Functional programming for modular Bayesian inference
Presents an architectural design of a library for Bayesian modelling and inference in modern functional programming languages that enables deterministic testing of inherently stochastic Monte Carlo algorithms, and demonstrates using OCaml that an expressive module system can also implement the design.
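
One way to read "deterministic testing of inherently stochastic Monte Carlo algorithms": when a sampler takes the random source as an explicit, seedable argument, fixing the seed makes runs reproducible and hence testable. A minimal Scala sketch of the idea (not the cited library's actual test harness):

import scala.util.Random

// Deterministic testing of a stochastic sampler: the random source is an
// explicit, seedable argument, so the same seed must reproduce the same
// draw. A sketch of the idea only, not the cited library's test harness.
object DeterministicTest {
  def normal(mu: Double, sigma: Double, rng: Random): Double =
    mu + sigma * rng.nextGaussian()

  def main(args: Array[String]): Unit = {
    val a = normal(0.0, 1.0, new Random(7L))
    val b = normal(0.0, 1.0, new Random(7L))
    assert(a == b, "same seed must yield identical samples")
    println(s"reproducible draw: $a")
  }
}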
Automatic Differentiation Variational Inference
Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
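
The key device behind ADVI is the reparameterisation gradient: write theta = mu + exp(ell) * eps with eps ~ N(0, 1) and take stochastic gradients of the ELBO with respect to (mu, ell). A hand-derived one-dimensional Scala sketch, with an illustrative toy model standing in for ADVI's automatic differentiation:

import scala.util.Random

// Reparameterisation-gradient sketch of variational inference on a toy
// conjugate model: theta ~ N(0, 1), x | theta ~ N(theta, 1), x observed.
// The joint's gradient is hand-derived below; ADVI itself obtains it by
// automatic differentiation, and all names here are illustrative.
object AdviSketch {
  val x = 2.5 // a single observation

  // d/dtheta [log N(theta; 0, 1) + log N(x; theta, 1)] = -theta + (x - theta)
  def gradLogJoint(theta: Double): Double = -theta + (x - theta)

  def main(args: Array[String]): Unit = {
    val rng = new Random(1L)
    var mu  = 0.0 // variational mean
    var ell = 0.0 // variational log standard deviation
    val lr  = 0.05
    for (_ <- 1 to 5000) {
      val eps   = rng.nextGaussian()
      val theta = mu + math.exp(ell) * eps // reparameterised sample
      val g     = gradLogJoint(theta)
      mu  += lr * g                               // since dtheta/dmu = 1
      ell += lr * (g * math.exp(ell) * eps + 1.0) // plus entropy gradient
    }
    // The exact posterior is N(1.25, 0.5), so mu should land near 1.25
    // and exp(ell) near sqrt(0.5), up to Monte Carlo noise.
    println(s"mu ~ $mu, sigma ~ ${math.exp(ell)}")
  }
}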
Denotational validation of higher-order Bayesian inference
A modular semantic account of Bayesian inference algorithms for probabilistic programming languages, as used in data science and machine learning, is presented, and Kock's synthetic measure theory is used to emphasize the connection between the semantic manipulation and its traditional measure-theoretic origins.
Practical probabilistic programming with monads
This work uses a GADT as an underlying representation of a probability distribution, applies Sequential Monte Carlo-based methods to achieve efficient inference, and defines a formal semantics via measure theory.
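
The GADT idea can be mimicked in Scala with a sealed trait whose constructors record the structure of a program for later interpretation by an inference algorithm. A rough sketch under that reading (the cited work is in Haskell, and these types are illustrative):

import scala.util.Random

// Deep embedding of a probabilistic program: constructors record the
// model's structure so that different interpreters (forward sampling,
// SMC, ...) can traverse it. A rough Scala analogue of the GADT idea.
sealed trait Prob[A] {
  def flatMap[B](f: A => Prob[B]): Prob[B] = Bind(this, f)
  def map[B](f: A => B): Prob[B] = Bind(this, (a: A) => Pure(f(a)))
}
final case class Pure[A](a: A) extends Prob[A]
final case class Bind[A, B](p: Prob[A], k: A => Prob[B]) extends Prob[B]
final case class Normal(mu: Double, sigma: Double) extends Prob[Double]

object Sampler {
  // One interpreter: ancestral (forward) sampling. The cast works around
  // Scala 2's limited GADT type refinement; Scala 3 infers A = Double.
  def run[A](p: Prob[A], rng: Random): A = p match {
    case Pure(a)           => a
    case Normal(mu, sigma) => (mu + sigma * rng.nextGaussian()).asInstanceOf[A]
    case Bind(q, k)        => run(k(run(q, rng)), rng)
  }
}

object GadtExample {
  def main(args: Array[String]): Unit = {
    // A tiny hierarchical model, built once and interpreted by the sampler.
    val model: Prob[Double] = Normal(0.0, 1.0).flatMap(mu => Normal(mu, 0.5))
    println(Sampler.run(model, new Random(3L)))
  }
}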
Monte Carlo Sampling Methods Using Markov Chains and Their Applications
A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application, and methods and difficulties of assessing the error in Monte Carlo estimates.
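
For concreteness, the symmetric-proposal special case of the algorithm this paper generalises, random-walk Metropolis, fits in a few lines of Scala (an illustrative sketch, not code from any of the works cited here):

import scala.util.Random

object Metropolis {
  // Random-walk Metropolis: propose theta' = theta + step * eps and
  // accept with probability min(1, p(theta') / p(theta)), computed in
  // log space for numerical stability.
  def chain(logDensity: Double => Double, init: Double, step: Double,
            iters: Int, rng: Random): Vector[Double] = {
    var theta = init
    val out = Vector.newBuilder[Double]
    for (_ <- 1 to iters) {
      val proposal = theta + step * rng.nextGaussian()
      val logAlpha = logDensity(proposal) - logDensity(theta)
      if (math.log(rng.nextDouble()) < logAlpha) theta = proposal // accept
      out += theta
    }
    out.result()
  }

  def main(args: Array[String]): Unit = {
    // Target: standard normal, log p(x) = -x^2 / 2 up to a constant.
    val draws = chain(x => -0.5 * x * x, init = 0.0, step = 1.0,
                      iters = 10000, rng = new Random(0L))
    println(s"sample mean ~ ${draws.sum / draws.size}")
  }
}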
WinBUGS - A Bayesian modelling framework: Concepts, structure, and extensibility
Discusses how and why various modern computing concepts, such as object-orientation and run-time linking, feature in the software's design, and how the framework may be extended.
Commutative Semantics for Probabilistic Programming
It is shown that probabilistic programs are in fact commutative, by characterizing the measures/kernels that arise from programs as 's-finite', i.e. sums of finite measures/kernels.
JAGS: A program for analysis of Bayesian graphical models using Gibbs sampling
JAGS is a program for Bayesian graphical modelling which aims for compatibility with Classic BUGS. The program could eventually be developed as an R package. This article explains the motivations for…
The No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
The No-U-Turn Sampler (NUTS), an extension to HMC that eliminates the need to set a number of steps L, is presented, along with a method for adapting the step size parameter ε on the fly based on primal-dual averaging.
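
NUTS builds on the leapfrog integrator of Hamiltonian Monte Carlo; the sketch below shows one plain HMC transition in Scala, making visible the step size eps and step count L that NUTS then tunes automatically (illustrative code, not the NUTS algorithm itself):

import scala.util.Random

// One Hamiltonian Monte Carlo transition: simulate Hamiltonian dynamics
// with L leapfrog steps of size eps, then accept or reject. NUTS removes
// the need to choose L and adapts eps; this sketch keeps both fixed.
object Hmc {
  def step(logP: Double => Double, gradLogP: Double => Double,
           theta0: Double, eps: Double, L: Int, rng: Random): Double = {
    var theta = theta0
    var p     = rng.nextGaussian()          // sample momentum
    val p0    = p
    p += 0.5 * eps * gradLogP(theta)        // initial half step for momentum
    for (i <- 1 to L) {
      theta += eps * p                      // full step for position
      if (i < L) p += eps * gradLogP(theta) // full step for momentum
    }
    p += 0.5 * eps * gradLogP(theta)        // final half step for momentum
    // Metropolis correction with Hamiltonian H = -logP(theta) + p^2 / 2.
    val logAccept = (logP(theta) - 0.5 * p * p) - (logP(theta0) - 0.5 * p0 * p0)
    if (math.log(rng.nextDouble()) < logAccept) theta else theta0
  }

  def main(args: Array[String]): Unit = {
    val rng = new Random(0L)
    var x   = 0.0
    // Target: standard normal; eps = 0.1 and L = 20 chosen by hand,
    // which is exactly the tuning burden NUTS is designed to remove.
    val draws = (1 to 5000).map { _ =>
      x = step(t => -0.5 * t * t, t => -t, x, 0.1, 20, rng)
      x
    }
    println(s"sample mean ~ ${draws.sum / draws.size}")
  }
}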