Corpus ID: 116983370

Bayesian machine learning via category theory

Jared Culbertson and Kirk Sturtz, "Bayesian machine learning via category theory", arXiv: Category Theory.
From the Bayesian perspective, the category of conditional probabilities (a variant of the Kleisli category of the Giry monad, whose objects are measurable spaces and arrows are Markov kernels) gives a nice framework for conceptualization and analysis of many aspects of machine learning. Using categorical methods, we construct models for parametric and nonparametric Bayesian reasoning on function spaces, thus providing a basis for the supervised learning problem. In particular, stochastic… 
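The Kleisli composition underlying the category of conditional probabilities can be sketched concretely for finite state spaces. This is a toy finite sketch, not the paper's measure-theoretic construction; the two-state weather chain and its transition probabilities are invented for illustration:

```python
# A kernel maps a state to a dict {next_state: probability}; composing two
# kernels sums over the intermediate state, which is exactly composition in
# the Kleisli category of the (finite) Giry monad.

def compose(k2, k1):
    """Kleisli composition: (k2 . k1)(x)[z] = sum_y k1(x)[y] * k2(y)[z]."""
    def composed(x):
        out = {}
        for y, p in k1(x).items():
            for z, q in k2(y).items():
                out[z] = out.get(z, 0.0) + p * q
        return out
    return composed

# Toy weather chain with states "sunny"/"rainy" (invented numbers).
def step(s):
    if s == "sunny":
        return {"sunny": 0.9, "rainy": 0.1}
    return {"sunny": 0.5, "rainy": 0.5}

two_steps = compose(step, step)
print(two_steps("sunny"))  # two-step transition distribution from "sunny"
```

Associativity of this composition (with point-mass distributions as identities) is what makes Markov kernels the arrows of a category.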
Categorical Stochastic Processes and Likelihood
  • Dan Shiebler
  • Mathematics, Computer Science
  • 2020
A category-theoretic perspective on the relationship between probabilistic modeling and gradient based optimization is taken and a way to compose the likelihood functions of these models is defined.
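One hedged reading of "composing the likelihood functions" of chained models: when a model factors into stages with conditional densities, the likelihood of an observed chain is the product of the stage likelihoods, so log-likelihoods add. The Gaussian stage models below are invented toy choices, not the paper's construction:

```python
import math

def gaussian_loglik(obs, mean, sigma):
    """Log-density of obs under N(mean, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (obs - mean) ** 2 / (2 * sigma**2)

def composed_loglik(x, y, z):
    # Stage 1: y | x ~ N(2x, 1); stage 2: z | y ~ N(y + 1, 0.25).
    # Composing the pipeline multiplies likelihoods, i.e. adds log-likelihoods.
    return gaussian_loglik(y, 2 * x, 1.0) + gaussian_loglik(z, y + 1, 0.5)

print(composed_loglik(1.0, 2.0, 3.0))
```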
Characterizing the invariances of learning algorithms using category theory
  • K. Harris
  • Mathematics, Computer Science
  • 2019
The framework for an invariant learning algorithm is a natural transformation between two functors from the product of these categories to the category of sets, representing training datasets and learned functions respectively.
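The naturality condition can be illustrated on a toy learner: transforming the training data and then learning gives the same answer as learning and then transforming the output. The 1-nearest-neighbour predictor and the relabelling below are invented for illustration, not the paper's formal setting:

```python
def knn1(train, x):
    """Predict the label of the training point nearest to x."""
    return min(train, key=lambda pair: abs(pair[0] - x))[1]

train = [(0.0, "a"), (1.0, "b"), (2.0, "a")]
sigma = {"a": "left", "b": "right"}  # a bijective relabelling of outputs

relabelled = [(x, sigma[y]) for x, y in train]

# The naturality square commutes: relabel-then-learn == learn-then-relabel.
x_query = 0.9
assert knn1(relabelled, x_query) == sigma[knn1(train, x_query)]
print(knn1(train, x_query))
```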
Information flow in context-dependent hierarchical Bayesian inference
This work introduces a novel and general approach to characterising contextuality using Chu spaces and Channel Theory, viewed as general theories of information flow, and relates contextuality to the Frame Problem, another way of characterising a fundamental limitation on the observational and inferential capabilities of finite agents.
Compositional Deep Learning
This thesis builds a category-theoretic formalism around a class of neural networks exemplified by CycleGAN, and uses the framework to conceive a novel neural network architecture whose goal is to learn the task of object insertion and object deletion in images with unpaired data.
Formal verification of higher-order probabilistic programs: reasoning about approximation, convergence, Bayesian inference, and optimization
A suite of logics, collectively named PPV, is presented for proving properties of programs written in an expressive probabilistic higher-order language with continuous sampling operations and primitives for conditioning distributions; expressiveness is shown by giving sound embeddings of existing logics.
A free energy principle for generic quantum systems.
Apprentissage par analogies grâce à des outils de la théorie des catégories(Learning through analogies using tools from category theory)
This work proposes a machine learning system that accumulates knowledge, then transposes it to new configurations, which can be attained with concepts borrowed from Cognitive Science and Social Science, and formalized mathematically using tools from Category Theory and Control Theory.
Category Theory in Machine Learning
This work aims to document the motivations, goals and common themes across these applications of category theory in machine learning, touching on gradient-based learning, probability, and equivariant learning.
Searching for Topological Symmetry in Data Haystack
A new method is presented for finding local symmetries in a low-dimensional 2-D grid structure embedded in a high-dimensional structure; the grid symmetry of data is computed and analyzed for multivariate Gaussian and Gamma distributions with noise.
Quantum GestART: identifying and applying correlations between mathematics, art, and perceptual organization
A method based on diagrammatic thinking and quantum formalism is proposed, exploiting decompositions of complex forms into a set of simple shapes, discretization of complex images, and Dirac notation, imagining a world of “prototypes” that can be connected to obtain a fine or coarse-graining approximation of a given visual image.


A Categorical Foundation for Bayesian Probability
An existence theorem for regular conditional probabilities due to Faden is used; it holds in more generality than the standard setting of Polish spaces and allows for non-trivial decision rules on finite (as well as non-finite) spaces.
Causal Theories: A Categorical Perspective on Bayesian Networks
In this dissertation we develop a new formal graphical framework for causal reasoning. Starting with a review of monoidal categories and their associated graphical languages, we then revisit…
Picturing classical and quantum Bayesian inference
We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference.
A Categorical Approach to Probability Theory
This work shows that the category ID of D-posets of fuzzy sets and sequentially continuous D-homomorphisms allows to characterize the passage from classical to fuzzy events as the minimal generalization having nontrivial quantum character.
Gaussian Processes for Machine Learning
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
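A minimal sketch of the GP regression setting the book covers: with a noise-free squared-exponential kernel, the posterior mean at a test input is k(x*, X) K(X, X)^{-1} y. Pure Python with two training points so the 2x2 system has a closed-form inverse; all values are toy choices, not from the book:

```python
import math

def rbf(a, b, ell=1.0):
    """Squared-exponential (RBF) covariance between scalar inputs a and b."""
    return math.exp(-((a - b) ** 2) / (2 * ell**2))

def gp_posterior_mean(X, y, x_star):
    # Gram matrix entries for the two training inputs
    k11, k12, k22 = rbf(X[0], X[0]), rbf(X[0], X[1]), rbf(X[1], X[1])
    det = k11 * k22 - k12 * k12
    # alpha = K^{-1} y via the closed-form 2x2 inverse
    a0 = (k22 * y[0] - k12 * y[1]) / det
    a1 = (k11 * y[1] - k12 * y[0]) / det
    return rbf(x_star, X[0]) * a0 + rbf(x_star, X[1]) * a1

X, y = [0.0, 2.0], [1.0, -1.0]
print(gp_posterior_mean(X, y, 0.0))  # noise-free GP interpolates: 1.0
print(gp_posterior_mean(X, y, 1.0))  # prediction between the two points
```

With no observation noise the posterior mean interpolates the training targets exactly, which the first print demonstrates.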
Machine learning - a probabilistic perspective
  • K. Murphy
  • Computer Science
    Adaptive computation and machine learning series
  • 2012
This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Real Analysis and Probability
1. Foundations: set theory
2. General topology
3. Measures
4. Integration
5. Lp spaces: introduction to functional analysis
6. Convex sets and duality of normed spaces
7. Measure, topology, and…
Bayesian reasoning and machine learning
Comprehensive and coherent, this hands-on text develops everything from basic reasoning to advanced techniques within the framework of graphical models, building the analytical and problem-solving skills that equip readers for the real world.
Kleisli morphisms and randomized congruences for the Giry monad