Corpus ID: 235422495

Category Theory in Machine Learning

@article{Shiebler2021CategoryTI,
  title={Category Theory in Machine Learning},
  author={Dan Shiebler and Bruno Gavranovi{\'c} and Paul Wilson},
  journal={ArXiv},
  year={2021},
  volume={abs/2106.07032}
}
Over the past two decades, machine learning has permeated almost every realm of technology. At the same time, many researchers have begun using category theory as a unifying language, facilitating communication between different scientific disciplines. It is therefore unsurprising that there is a burgeoning interest in applying category theory to machine learning. We aim to document the motivations, goals, and common themes across these applications. We touch on gradient-based learning…

Citations

Kan Extensions in Data Science and Machine Learning
TLDR
This work derives a simple classification algorithm as a Kan extension, experiments with it on real data, and investigates how Kan extensions can be used to learn a general mapping from datasets of labeled examples to functions and to approximate a complex function with a simpler one.
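To make the idea concrete, here is a minimal Python sketch (our own illustration with assumed names and signatures, not the paper's code) of the preorder special case: a labeled dataset is a Bool-valued functor on the training points, and extending it along the inclusion into the full input space by a left or right Kan extension yields an optimistic or a conservative classifier.

```python
# Minimal sketch: Kan-extension classifiers on a preorder. `train` is a
# list of (point, bool_label) pairs and `leq` the preorder on inputs;
# both names are illustrative assumptions.

def lan_classifier(train, leq):
    """Left Kan extension along the inclusion of the training points:
    x is positive iff SOME positive training point lies below it."""
    return lambda x: any(label for a, label in train if leq(a, x))

def ran_classifier(train, leq):
    """Right Kan extension: x is positive iff EVERY training point
    above it is positive (vacuously true when none is)."""
    return lambda x: all(label for a, label in train if leq(x, a))

# Toy usage: a 1-D threshold concept under the usual order on floats.
data = [(0.2, False), (0.4, False), (0.7, True), (0.9, True)]
f = lan_classifier(data, lambda a, b: a <= b)
print(f(0.8), f(0.1))  # True False
```

The two extensions bracket the unknown concept: the left Kan extension is the least monotone classifier consistent with the positive examples, and the right Kan extension the greatest one consistent with the negative examples.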
Compositionality as we see it, everywhere around us
TLDR
Inspired by work on compositionality in quantum theory, and categorical quantum mechanics in particular, the notions of Schrödinger, Whitehead, and complete compositionality are proposed, which aim to capture the fact that compositionality is at its best when it is 'real', 'non-trivial', and, even more, when it is also 'complete'.

References

Showing 1-10 of 119 references
Categorical Foundations of Gradient-Based Learning
TLDR
A categorical foundation of gradient-based machine learning algorithms is proposed in terms of lenses, parametrised maps, and reverse derivative categories, encompassing a variety of gradient descent algorithms such as ADAM, AdaGrad, and Nesterov momentum.
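The lens picture is easy to prototype. Below is a hedged Python sketch (our own toy construction, not the paper's code): a lens pairs a forward pass with a reverse derivative, lens composition chains forwards left-to-right and backwards right-to-left, and a plain gradient-descent loop drives the parameter; the paper's ADAM, AdaGrad, and Nesterov variants would replace the simple update rule used here.

```python
# A lens is a pair (forward, backward), where backward maps an input and
# an output-cotangent to an input-cotangent; this representation is an
# assumption of the sketch.

def compose(lens1, lens2):
    f1, b1 = lens1
    f2, b2 = lens2
    fwd = lambda x: f2(f1(x))
    bwd = lambda x, dy: b1(x, b2(f1(x), dy))  # reverse-mode chain rule
    return fwd, bwd

# Parametrised linear model y = w * x, as a lens on the pair (w, x).
model = (lambda wx: wx[0] * wx[1],
         lambda wx, dy: (dy * wx[1], dy * wx[0]))  # cotangents for (w, x)

# Squared-error loss against a fixed target, as a lens.
def loss_lens(target):
    return (lambda y: (y - target) ** 2,
            lambda y, dl: 2 * (y - target) * dl)

# Gradient descent: push a unit cotangent back through loss ∘ model.
w, x, target, lr = 0.0, 2.0, 6.0, 0.05
fwd, bwd = compose(model, loss_lens(target))
for _ in range(100):
    dw, _ = bwd((w, x), 1.0)
    w -= lr * dw
print(round(w, 3))  # ≈ 3.0, since 3.0 * 2.0 = 6.0
```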
Compositional Deep Learning
TLDR
This thesis builds a category-theoretic formalism around a class of neural networks exemplified by CycleGAN, and uses the framework to conceive a novel neural network architecture whose goal is to learn the task of object insertion and object deletion in images with unpaired data.
Learning Functors using Gradient Descent
TLDR
A category-theoretic formalism is built around CycleGAN, a general approach to unpaired image-to-image translation that has attracted attention in recent years, and it is shown that enforcing cycle-consistencies amounts to enforcing composition invariants in this category.
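As a small illustration of "composition invariants" (our own sketch, not the thesis code): for maps f : A → B and g : B → A, the cycle-consistency loss measures how far the composite g ∘ f is from the identity on A, so minimising it enforces an equation between composites.

```python
# Cycle-consistency as a composition invariant: penalise the distance
# of g∘f from the identity. Names and the L1 choice are assumptions.
import numpy as np

def cycle_loss(f, g, xs):
    """Mean L1 distance between g(f(x)) and x over a batch."""
    return np.mean([np.abs(g(f(x)) - x).sum() for x in xs])

# Toy maps: f halves and g doubles, so g∘f = id and the loss vanishes.
xs = [np.array([1.0, -2.0]), np.array([0.5, 3.0])]
print(cycle_loss(lambda x: x / 2, lambda x: 2 * x, xs))  # 0.0
```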
What is Applied Category Theory
This is a collection of introductory, expository notes on applied category theory, inspired by the 2018 Applied Category Theory Workshop; in these notes we take a leisurely stroll through two…
Reverse Derivative Ascent: A Categorical Approach to Learning Boolean Circuits
TLDR
Reverse Derivative Ascent is introduced: a categorical analogue of gradient-based methods for machine learning that makes it possible to learn the parameters of Boolean circuits directly, in contrast to existing binarised neural network approaches.
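A toy flavour of the method over GF(2) (our construction, intended to illustrate the idea rather than reproduce the paper's algorithm): for a linear Boolean circuit y = Wx (mod 2), the reverse derivative with respect to the parameters sends an output error δ to the outer product δxᵀ, and "ascent" flips exactly those parameter bits.

```python
# Reverse-derivative-style update for a linear circuit over GF(2).
import numpy as np

W_true = np.array([[1, 0, 1, 1],
                   [0, 1, 1, 0],
                   [1, 1, 0, 1]])              # target circuit (assumed)
W = np.zeros_like(W_true)                      # learned parameters

x = np.array([1, 0, 1, 1])                     # odd-parity input
delta = (W @ x + W_true @ x) % 2               # XOR error signal
W = (W + np.outer(delta, x)) % 2               # flip the indicated bits

# For an odd-parity x a single step makes the circuit agree with the
# target on that input, since x·x = 1 (mod 2) cancels the error term.
assert np.array_equal(W @ x % 2, W_true @ x % 2)
```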
Characterizing the invariances of learning algorithms using category theory
K. Harris · Mathematics, Computer Science · ArXiv · 2019
TLDR
In this framework, an invariant learning algorithm is a natural transformation between two functors from the product of these categories to the category of sets, representing training datasets and learned functions respectively.
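The naturality condition is checkable on toy examples. The following hedged sketch (ours) verifies the commuting square for a nearest-centroid learner and a feature permutation: transforming the dataset and then training agrees with training and then transforming the learned function.

```python
# Naturality square for an invariant learner: train(g·D) evaluated at
# g·x equals train(D) evaluated at x. The learner is an assumed example.
import numpy as np

def train(data):
    """Nearest-centroid classifier from (point, label) pairs."""
    centroids = {c: np.mean([x for x, y in data if y == c], axis=0)
                 for c in {y for _, y in data}}
    return lambda x: min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

act = lambda v: np.asarray(v)[[1, 0]]          # swap the two features

data = [(np.array([0.0, 1.0]), "a"), (np.array([4.0, 5.0]), "b")]
x = np.array([3.0, 3.5])

lhs = train([(act(p), y) for p, y in data])(act(x))  # transform, then train
rhs = train(data)(x)                                  # train, then transform
assert lhs == rhs  # the square commutes: permutations preserve distances
```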
Topological Methods for Unsupervised Learning
TLDR
The languages of topology and category theory are used to provide a unified mathematical approach to three major problems in unsupervised learning: dimension reduction, clustering, and anomaly detection.
Fighting Redundancy and Model Decay with Embeddings
TLDR
This paper details the commoditized tools, algorithms, and pipelines developed at Twitter to regularly generate high-quality, up-to-date embeddings and share them broadly across the company.
Natural Graph Networks
Conventional neural message passing algorithms are invariant under permutation of the messages and hence forget how the information flows through the network. Studying the local symmetries of graphs,…
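The invariance in the first sentence is easy to exhibit (a minimal sketch of ours, not the paper's model): sum-aggregation of neighbour messages returns the same value for every ordering of the messages, so a node cannot tell which edge a message arrived on.

```python
# Permutation invariance of conventional message aggregation.
import numpy as np

def aggregate(messages):
    return np.sum(messages, axis=0)  # order-independent reduction

msgs = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 1.0])]
assert np.array_equal(aggregate(msgs), aggregate(msgs[::-1]))
```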
Categorical Stochastic Processes and Likelihood
Dan Shiebler · Mathematics, Computer Science · Compositionality · 2020
TLDR
A category-theoretic perspective is taken on the relationship between probabilistic modeling and gradient-based optimization, and a way to compose the likelihood functions of these models is defined.
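One simple reading of "composing likelihoods" (a hedged sketch of ours that flattens the paper's categorical construction): when intermediate values are observed, the log-likelihood of a composite of two stochastic maps is the sum of the stage-wise log-likelihoods.

```python
# Composing log-likelihoods of observed stages via the chain rule
# log p(y, z | x) = log p(y | x) + log p(z | y). Model choice assumed.
import math

def gaussian_stage(sigma):
    """log p(y | x) for y = x + Gaussian noise with std sigma."""
    def loglik(x, y):
        return (-0.5 * ((y - x) / sigma) ** 2
                - math.log(sigma * math.sqrt(2 * math.pi)))
    return loglik

def compose_loglik(l1, l2):
    return lambda x, y, z: l1(x, y) + l2(y, z)

ll = compose_loglik(gaussian_stage(1.0), gaussian_stage(2.0))
print(ll(0.0, 0.5, 1.0))  # joint log-likelihood of the two observed steps
```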