Categorical Stochastic Processes and Likelihood

  • Dan Shiebler
  • Published 10 May 2020
  • Mathematics, Computer Science
  • ArXiv
We take a category-theoretic perspective on the relationship between probabilistic modeling and gradient-based optimization. We define two extensions of function composition to stochastic process subordination: one based on a co-Kleisli category and one based on the parameterization of a category with a Lawvere theory. We show how these extensions relate to the category of Markov kernels Stoch through a pushforward procedure. We extend stochastic processes to parametric statistical models and…
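The abstract's pushforward into the category Stoch can be illustrated concretely. In a minimal finite sketch (the representation below is an assumption, not the paper's construction), a Markov kernel between finite sets is a row-stochastic matrix, composition is matrix multiplication, and pushing a distribution forward along a kernel is a vector-matrix product:

```python
import numpy as np

# Sketch (assumed finite-state representation): a Markov kernel from an
# m-element set to an n-element set is an m x n row-stochastic matrix;
# composition in Stoch is then matrix multiplication
# (the Chapman-Kolmogorov equation).

def compose(k1: np.ndarray, k2: np.ndarray) -> np.ndarray:
    """Compose kernels k1: A -> B and k2: B -> C into a kernel A -> C."""
    return k1 @ k2

def pushforward(dist: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Push a distribution on A forward along a kernel A -> B."""
    return dist @ kernel

k1 = np.array([[0.9, 0.1],
               [0.2, 0.8]])   # kernel on a 2-state space
k2 = np.array([[0.5, 0.5],
               [0.3, 0.7]])

k12 = compose(k1, k2)
assert np.allclose(k12.sum(axis=1), 1.0)  # composite rows still sum to 1

p = np.array([1.0, 0.0])       # point mass on state 0
q = pushforward(p, k12)        # resulting distribution on the codomain
```

Pushing forward a point mass simply selects the corresponding row of the composite kernel, which is one way to see that composition and pushforward are compatible.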

Tables from this paper

AlgebraicSystems: Compositional Verification for Autonomous System Design
AlgebraicSystems is proposed: a collection of algebraic methods that assign semantics and categorical primitives to give computational meaning to relationships between models, so that the formalisms and resulting tools are interoperable through vertical and horizontal composition.
Category Theory in Machine Learning
This work aims to document the motivations, goals and common themes across these applications of category theory in machine learning, touching on gradient-based learning, probability, and equivariant learning.


Compositional Deep Learning
This thesis builds a category-theoretic formalism around a class of neural networks exemplified by CycleGAN, and uses the framework to conceive a novel neural network architecture whose goal is to learn the task of object insertion and object deletion in images with unpaired data.
Causal Theories: A Categorical Perspective on Bayesian Networks
In this dissertation we develop a new formal graphical framework for causal reasoning. Starting with a review of monoidal categories and their associated graphical languages, we then revisit…
Backprop as Functor: A compositional perspective on supervised learning
A key contribution is the notion of request function, which provides a structural perspective on backpropagation, giving a broad generalisation of neural networks and linking it with structures from bidirectional programming and open games.
Conditional Independence
This article has been prepared as an entry for the Wiley Encyclopedia of Statistical Sciences (Update). It gives a brief overview of fundamental properties and applications of conditional independence.
A convenient category for higher-order probability theory
This work demonstrates the use of quasi-Borel spaces for higher-order functions and probability by showing that a well-known construction of probability theory involving random functions gains a cleaner expression, and by generalizing de Finetti's theorem, a crucial theorem in probability theory, to quasi-Borel spaces.
Categorial Lévy Processes
We generalize Franz’ independence in tensor categories with inclusions from two morphisms (which represent generalized random variables) to arbitrary ordered families of morphisms. We will see that…
Categorical Foundations of Gradient-Based Learning
A categorical foundation of gradient-based machine learning algorithms in terms of lenses, parametrised maps, and reverse derivative categories is proposed, which encompasses a variety of gradient descent algorithms such as ADAM, AdaGrad, and Nesterov momentum.
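The parametrised maps mentioned above admit a compact illustration. In a minimal "Para"-style sketch (the names and encoding below are assumptions for illustration, not the paper's definitions), a parametrised map is a function taking a parameter and an input, and composition pairs the parameter spaces while chaining the functions:

```python
from typing import Any, Callable, Tuple

# Sketch (assumed encoding): a parametrised map P x A -> B is an ordinary
# Python function of (parameter, input); composing f: P x A -> B with
# g: Q x B -> C yields a map with parameter space P x Q.

def compose_para(
    f: Callable[[Any, Any], Any],
    g: Callable[[Any, Any], Any],
) -> Callable[[Tuple[Any, Any], Any], Any]:
    """Compose parametrised maps f and g into a map (P x Q) x A -> C."""
    def h(params: Tuple[Any, Any], a: Any) -> Any:
        p, q = params
        return g(q, f(p, a))
    return h

# toy example: a scaling layer followed by a shift layer
scale = lambda w, x: w * x   # parameter space P = weights
shift = lambda b, x: x + b   # parameter space Q = biases

layer = compose_para(scale, shift)
result = layer((3, 1), 2)    # (2 * 3) + 1 = 7
```

Gradient-based training then amounts to updating the paired parameters of the composite map, which is the structure the lens/reverse-derivative formulation makes precise.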