Applied Measure Theory for Probabilistic Modeling

@article{Scherrer2022AppliedMT,
  title={Applied Measure Theory for Probabilistic Modeling},
  author={Chad Scherrer and Moritz Schauer},
  journal={ArXiv},
  year={2022},
  volume={abs/2110.00602}
}
Probabilistic programming and statistical computing are vibrant areas in the development of the Julia programming language, but the underlying infrastructure dramatically predates recent developments. The goal of MeasureTheory.jl is to provide Julia with the right vocabulary and tools for these tasks. In the package we introduce a well-chosen set of notions from the foundations of probability together with powerful combinators and transforms, giving a gentle introduction to the concepts in…

References


Stochastic lambda calculus and monads of probability distributions

TLDR
A translation of stochastic lambda calculus into measure terms is given, which can not only denote discrete probability distributions but can also support the best known modeling techniques.
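The probability monad behind this translation (due to Ramsey and Pfeffer) can be illustrated for the discrete case. The sketch below is not from the paper; it is a minimal Python rendering of the standard construction, with `Dist`, `unit`, and `bind` as illustrative names: `unit` builds a point mass and `bind` sequences a distribution with a stochastic continuation, summing probabilities over all paths to each outcome.

```python
from fractions import Fraction

class Dist:
    """A finite discrete distribution: outcomes mapped to probabilities."""
    def __init__(self, weights):
        total = sum(weights.values())
        self.probs = {x: w / total for x, w in weights.items()}

    @staticmethod
    def unit(x):
        # Monadic return: a point mass on x.
        return Dist({x: Fraction(1)})

    def bind(self, f):
        # Monadic bind: draw x from self, then run the stochastic
        # continuation f(x); marginalize by summing path probabilities.
        out = {}
        for x, p in self.probs.items():
            for y, q in f(x).probs.items():
                out[y] = out.get(y, Fraction(0)) + p * q
        return Dist(out)

def uniform(xs):
    return Dist({x: Fraction(1) for x in xs})

# Two fair coin flips, counting heads: P(0)=1/4, P(1)=1/2, P(2)=1/4.
coin = uniform([0, 1])
heads = coin.bind(lambda a: coin.bind(lambda b: Dist.unit(a + b)))
```

Using exact `Fraction` arithmetic keeps the semantics faithful: probabilities sum to exactly 1, which is the property the measure-term denotation preserves.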

The Base Measure Problem and its Solution

TLDR
It is proposed to solve the base measure problem by standardizing on Hausdorff measure as a base, and by deriving a formula and software architecture for updating densities with respect to Hausdorff measure under diffeomorphic transformations.
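The density-update rule at the heart of this is the change-of-variables formula; in one dimension, if X has density p and Y = f(X) for a diffeomorphism f, then q(y) = p(f⁻¹(y)) · |(f⁻¹)′(y)|. The snippet below is an illustrative sketch of that rule only (not the paper's Hausdorff-measure machinery), with `pushforward_density` as a hypothetical helper name:

```python
import math

def pushforward_density(p, f_inv, f_inv_prime):
    """Density of Y = f(X), given the density p of X, for a 1-D
    diffeomorphism f: q(y) = p(f_inv(y)) * |f_inv'(y)|."""
    return lambda y: p(f_inv(y)) * abs(f_inv_prime(y))

# Example: X ~ Exponential(1), Y = log(X).
# Then f_inv(y) = exp(y) and f_inv'(y) = exp(y).
p_exp = lambda x: math.exp(-x)
q = pushforward_density(p_exp, math.exp, math.exp)
# q(0.0) = p(1) * 1 = exp(-1)
```

The Jacobian factor is exactly the term that must be tracked consistently with the base measure; dropping it is the bug the "base measure problem" names.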

Julia: A Fresh Approach to Numerical Computing

TLDR
The Julia programming language and its design are introduced---a dance between specialization and abstraction, which recognizes what remains the same after computation, and what is best left untouched as it has been built by the experts.

The Boomerang Sampler

TLDR
This paper introduces the Boomerang Sampler as a novel class of continuous-time non-reversible Markov chain Monte Carlo algorithms and demonstrates theoretically and empirically that it can outperform existing benchmark piecewise deterministic Markov processes such as the bouncy particle sampler and the Zig-Zag sampler.

Sticky PDMP samplers for sparse and local inference problems

TLDR
A new class of efficient Monte Carlo methods suitable for inference in high dimensional sparse models, i.e. models for which there is prior knowledge that many coordinates are likely to be exactly 0, is constructed.

A Language for Counterfactual Generative Models

TLDR
This work presents OMEGAC, a probabilistic programming language with support for counterfactual inference, by introducing a new operator to probabilistic programming akin to Pearl’s do, and defining its formal semantics and implementation.

Probabilistic Inference by Program Transformation in Hakaru (System Description)

We present Hakaru, a new probabilistic programming system that allows composable reuse of distributions, queries, and inference algorithms, all expressed in a single language of measures. The system…

Automatic Backward Filtering Forward Guiding for Markov processes and graphical models

TLDR
This work backpropagates the information provided by observations through the model to transform the generative (forward) model into a pre-conditional model guided by the data, which approximates the actual conditional model with known likelihood-ratio between the two.

Probability...

TLDR
This course can be used as preparation for the first (Probability) actuarial exam and covers the central limit theorem and classical sampling distributions.

Generating random correlation matrices based on vines and extended onion method