• Corpus ID: 146121288

Characterizing the invariances of learning algorithms using category theory

@article{Harris2019CharacterizingTI,
  title={Characterizing the invariances of learning algorithms using category theory},
  author={Kenneth D. Harris},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.02072}
}
  • K. Harris
  • Published 6 May 2019
  • Mathematics, Computer Science
  • ArXiv
Many learning algorithms have invariances: when their training data is transformed in certain ways, the function they learn transforms in a predictable manner. Here we formalize this notion using concepts from the mathematical field of category theory. The invariances that a supervised learning algorithm possesses are formalized by categories of predictor and target spaces, whose morphisms represent the algorithm's invariances, and an index category whose morphisms represent permutations of the… 
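To make the setup concrete, here is a minimal sketch of the kind of condition being formalized; the notation L, f, X, Y is illustrative and assumed, not taken verbatim from the paper. A learner L_{X,Y} maps a dataset of n labelled pairs to a predictor, and an invariance under a predictor-space morphism f : X → X' says that training on transformed inputs yields the correspondingly transformed predictor:

\[
  L_{X',Y}\bigl((f(x_1),y_1),\dots,(f(x_n),y_n)\bigr) \circ f
  \;=\; L_{X,Y}\bigl((x_1,y_1),\dots,(x_n,y_n)\bigr).
\]

For example, ordinary least squares satisfies this for invertible linear reparameterizations of the inputs, so such maps can serve as morphisms of its predictor category; functoriality then packages all of an algorithm's invariances at once.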
Compositional Deep Learning
TLDR
This thesis builds a category-theoretic formalism around a class of neural networks exemplified by CycleGAN, and uses the framework to conceive a novel neural network architecture whose goal is to learn the task of object insertion and object deletion in images with unpaired data.
Category Theory in Machine Learning
TLDR
This work aims to document the motivations, goals and common themes across these applications of category theory in machine learning, touching on gradient-based learning, probability, and equivariant learning.

References

SHOWING 1-10 OF 13 REFERENCES
On Invariance and Selectivity in Representation Learning
TLDR
This paper builds on the idea that data representations learned in an unsupervised manner can be key to learning "good" representations that lower the need for labeled data in machine learning.
Bayesian machine learning via category theory
TLDR
Using categorical methods, models for parametric and nonparametric Bayesian reasoning on function spaces are constructed, thus providing a basis for the supervised learning problem.
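For orientation, one standard way to set this up (a sketch of a common construction, not a quotation from the cited paper) treats conditional probabilities themselves as morphisms: a map f : X → Y in a category of Markov kernels sends each x to a probability measure f(· | x) on Y, and composition integrates out the intermediate variable:

\[
  (g \circ f)(B \mid x) \;=\; \int_Y g(B \mid y)\, f(\mathrm{d}y \mid x),
\]

so Bayesian updating becomes a statement about morphisms rather than about densities.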
Backprop as Functor: A compositional perspective on supervised learning
TLDR
A key contribution is the notion of request function, which provides a structural perspective on backpropagation, giving a broad generalisation of neural networks and linking it with structures from bidirectional programming and open games.
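To gloss that TLDR (a sketch of the definition published in the cited paper; the symbol names here are mine): a learner from A to B is a tuple

\[
  \bigl(P,\; I : P \times A \to B,\; U : P \times A \times B \to P,\; r : P \times A \times B \to A\bigr),
\]

where P is a parameter space, I implements the current predictor, U updates the parameters on seeing a training pair, and the request function r sends a revised input back to the preceding learner, which is what lets learners compose like layers under gradient descent and backpropagation.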
What is a statistical model?
This paper addresses two closely related questions: What is a statistical model? and What is a parameter? The notions that a model must make sense, and that a parameter must have a well-defined meaning, are deeply ingrained in applied statistical work.
∞-Categories for the Working Mathematician
Category Theory for the Sciences
TLDR
Category Theory for the Sciences is intended to create a bridge between the vast array of mathematical concepts used by mathematicians and the models and frameworks of such scientific disciplines as computation, neuroscience, and physics.
An Invitation to Applied Category Theory
Category theory is unmatched in its ability to organize and layer abstractions and to find commonalities between structures of all sorts. No longer the exclusive preserve of pure mathematicians, it is now finding applications across the sciences.
The Elements of Statistical Learning
TLDR
Chapter 11 includes more case studies in other areas, ranging from manufacturing to marketing research, and a detailed comparison with other diagnostic tools, such as logistic regression and tree-based methods.
An Introduction to Multivariate Statistical Analysis
Categories for the Working Mathematician, volume 5. Springer Science & Business Media, 2013.
...