# Probabilistic programming in Python using PyMC3

```bibtex
@article{Salvatier2016ProbabilisticPI,
  title   = {Probabilistic programming in Python using PyMC3},
  author  = {John Salvatier and Thomas V. Wiecki and Christopher J Fonnesbeck},
  journal = {PeerJ Comput. Sci.},
  year    = {2016},
  volume  = {2},
  pages   = {e55}
}
```

Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile…
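Since the abstract turns on Hamiltonian Monte Carlo's need for gradient information, a minimal NumPy sketch of the leapfrog integrator at the core of HMC may help fix ideas. The function name and the standard-Gaussian example target are illustrative; this is not PyMC3's actual implementation, which obtains the gradient automatically from Theano.

```python
import numpy as np

def leapfrog(q, p, grad_log_p, step_size, n_steps):
    """Simulate one Hamiltonian trajectory with leapfrog integration.

    q: position (model parameters), p: momentum,
    grad_log_p: gradient of the log target density at q.
    """
    p = p + 0.5 * step_size * grad_log_p(q)      # half step for momentum
    for _ in range(n_steps - 1):
        q = q + step_size * p                    # full step for position
        p = p + step_size * grad_log_p(q)        # full step for momentum
    q = q + step_size * p                        # final position step
    p = p + 0.5 * step_size * grad_log_p(q)      # final half step for momentum
    return q, p

# Example target: standard Gaussian, log p(q) = -q^2/2, so grad log p(q) = -q.
q, p = leapfrog(np.array([1.0]), np.array([0.5]),
                lambda q: -q, step_size=0.1, n_steps=20)
```

A useful sanity check on any leapfrog implementation is that the Hamiltonian (potential plus kinetic energy) is nearly conserved along the trajectory, which is what makes HMC proposals accepted with high probability.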

#### Supplemental Code

- GitHub repo
- Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Aesara

#### 1,129 Citations

Evaluating probabilistic programming and fast variational Bayesian inference in phylogenetics

- Medicine, Computer Science
- PeerJ
- 2019

It is shown that many commonly used phylogenetic models, including the general time reversible (GTR) substitution model, rate heterogeneity among sites, and a range of coalescent models, can be implemented using a probabilistic programming language.

Probabilistic programming with programmable inference

- Computer Science
- PLDI
- 2018

Inference metaprogramming enables the concise expression of probabilistic models and inference algorithms across diverse fields, such as computer vision, data science, and robotics, within a single probabilistic programming language.

Pomegranate: fast and flexible probabilistic modeling in python

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2017

An overview of the design choices in pomegranate is presented, showing how they have enabled complex features to be supported by simple code, making it competitive with, or able to outperform, other implementations of similar algorithms.

ProBO: Versatile Bayesian Optimization Using Any Probabilistic Programming Language

- Computer Science
- 2019

ProBO is developed, a BO procedure that uses only standard operations common to most PPLs, and allows a user to drop in a model built with an arbitrary PPL and use it directly in BO.

Hamiltonian Monte Carlo for Probabilistic Programs with Discontinuities

- Mathematics, Computer Science
- 2018

A simple first-order probabilistic programming language (SPPL) is designed that contains a sufficient set of language restrictions together with a compilation scheme, enabling both the statistical and syntactic interpretation of if-else statements in the probabilistic program, within the scope of first-order PPLs.

Expectation Programming

- Computer Science
- ArXiv
- 2021

A particular instantiation of the EPF concept is realized by extending the probabilistic programming language Turing to allow so-called target-aware inference to be run automatically, and it is shown that this leads to significant empirical gains compared to conventional posterior-based inference.

ProBO: a Framework for Using Probabilistic Programming in Bayesian Optimization

- Computer Science, Mathematics
- ArXiv
- 2019

This paper develops ProBO, a framework for Bayesian optimization using only standard operations common to most probabilistic programs (PPs), and introduces a model, termed the Bayesian Product of Experts, that integrates into ProBO and can be used to combine information from multiple models implemented with different PPs.

Reversible Jump Probabilistic Programming

- Computer Science
- AISTATS
- 2019

This paper presents a method for automatically deriving a Reversible Jump Markov chain Monte Carlo sampler from probabilistic programs that specify the target and proposal distributions, which relies on the interaction of several different components, including automatic differentiation, transformation inversion, and optimised code generation.

#### References

Showing 1–10 of 35 references

A New Approach to Probabilistic Programming Inference

- Mathematics, Computer Science
- AISTATS
- 2014

A new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo is introduced; it supports accurate inference in models that make use of complex control flow, including stochastic recursion.

Venture: a higher-order probabilistic programming platform with programmable inference

- Computer Science, Mathematics
- ArXiv
- 2014

Stochastic regeneration is shown to achieve linear runtime scaling in cases where many previous approaches scaled quadratically, and it is shown how to use stochastic regeneration and the SPI to implement general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational inference techniques.

Church: a language for generative models

- Computer Science
- UAI
- 2008

This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.

Automatic Variational Inference in Stan

- Computer Science, Mathematics
- NIPS
- 2015

An automatic variational inference algorithm, automatic differentiation variational inference (ADVI), is presented; it is implemented in Stan, a probabilistic programming system, and can be used on any model the authors write in Stan.

Frequentism and Bayesianism: A Python-driven Primer

- Computer Science, Physics
- 2014

This paper presents a brief, semi-technical comparison of the essential features of the frequentist and Bayesian approaches to statistical inference, with several illustrative examples implemented in…

Picture: A probabilistic programming language for scene perception

- Computer Science
- 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2015

Picture is presented, a probabilistic programming language for scene understanding that allows researchers to express complex generative vision models, while automatically solving them using fast general-purpose inference machinery.

The No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2014

The No-U-Turn Sampler (NUTS), an extension to HMC that eliminates the need to set a number of steps L, is presented, along with a method for adapting the step size parameter ε on the fly based on primal-dual averaging.
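The step-size adaptation mentioned here can be sketched in plain Python. The constants below follow the defaults reported in the NUTS paper (γ = 0.05, t₀ = 10, κ = 0.75, μ = log(10ε₀)); the function name and the batch interface over a list of acceptance probabilities are illustrative assumptions, not the paper's exact algorithm statement.

```python
import math

def dual_averaging_step_size(accept_probs, target=0.65,
                             gamma=0.05, t0=10.0, kappa=0.75, eps0=1.0):
    """Adapt the step size eps so that the average Metropolis acceptance
    probability approaches `target` (a sketch of primal-dual averaging)."""
    mu = math.log(10.0 * eps0)          # shrinkage point: 10x the initial eps
    h_bar = 0.0                         # running average of (target - alpha)
    log_eps_bar = 0.0                   # averaged log step size (the output)
    for t, alpha in enumerate(accept_probs, start=1):
        eta = 1.0 / (t + t0)
        h_bar = (1.0 - eta) * h_bar + eta * (target - alpha)
        log_eps = mu - math.sqrt(t) / gamma * h_bar
        w = t ** (-kappa)
        log_eps_bar = w * log_eps + (1.0 - w) * log_eps_bar
    return math.exp(log_eps_bar)

# If acceptance is consistently too low, the adapted step size shrinks:
eps_low = dual_averaging_step_size([0.1] * 100)
# If acceptance is consistently too high, it grows:
eps_high = dual_averaging_step_size([0.99] * 100)
```

The intuition is a feedback loop: acceptance below the target pushes the step size down (smaller steps are accepted more often), and acceptance above it pushes the step size up.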

Python reference manual

- Computer Science
- 1995

This reference manual describes the syntax and "core semantics" of the Python language; it is terse, but attempts to be exact and complete.

Cython: The Best of Both Worlds

- Computer Science
- Computing in Science & Engineering
- 2011

Cython is a Python language extension that allows explicit type declarations and is compiled directly to C. As such, it addresses Python's large overhead for numerical loops and the difficulty of…

The NumPy Array: A Structure for Efficient Numerical Computation

- Physics, Computer Science
- Computing in Science & Engineering
- 2011

This effort shows that NumPy performance can be improved through three techniques: vectorizing calculations, avoiding copying data in memory, and minimizing operation counts.
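The first of these techniques, vectorization, can be illustrated with a small hypothetical example contrasting an element-wise Python loop with an equivalent computation expressed as a single array operation; the function names here are illustrative, not from the paper.

```python
import numpy as np

def norm_loop(xs):
    """Euclidean norm via an explicit Python loop: one interpreted
    operation per element."""
    total = 0.0
    for x in xs:
        total += x * x
    return total ** 0.5

def norm_vec(xs):
    """The same computation vectorized: a single dot product executed
    in compiled code, with no per-element Python overhead."""
    return float(np.sqrt(np.dot(xs, xs)))

xs = np.arange(1000, dtype=np.float64)
# Both versions compute the same value; the vectorized one scales far
# better as the array grows.
assert abs(norm_loop(xs) - norm_vec(xs)) < 1e-6
```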