Backprop as Functor: A compositional perspective on supervised learning

@article{Fong2019BackpropAF,
  title={Backprop as Functor: A compositional perspective on supervised learning},
  author={B. Fong and David I. Spivak and R{\'e}my Tuy{\'e}ras},
  journal={2019 34th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS)},
  year={2019},
  pages={1-13}
}
A supervised learning algorithm searches over a set of functions $A\rightarrow B$ parametrised by a space $P$ to find the best approximation to some ideal function $f:A\rightarrow B$. It does this by taking examples $(a, f(a))\in A\times B$ and updating the parameter according to some rule. We define a category where these update rules may be composed, and show that gradient descent, with respect to a fixed step size and an error function satisfying a certain property, defines a monoidal functor…
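To make the abstract's structure concrete, here is a minimal Python sketch, offered as my own illustration rather than as code from the paper: a learner is a parameter together with implement, update, and request maps; two learners compose sequentially; and an affine map with quadratic error and a fixed step size yields a gradient-descent learner. The names Learner, compose, affine_learner, and eps are this sketch's assumptions, not the paper's notation.

    # Sketch of the "learner" structure from the abstract: a parameter space P with
    # implement/update/request maps, sequential composition, and a gradient-descent
    # instance for an affine map under quadratic error. Illustrative only.
    from dataclasses import dataclass
    from typing import Any, Callable
    import numpy as np

    @dataclass
    class Learner:
        p: Any                                                         # current parameter, a point of P
        implement: Callable[[Any, np.ndarray], np.ndarray]            # I : P x A -> B
        update: Callable[[Any, np.ndarray, np.ndarray], Any]          # U : P x A x B -> P
        request: Callable[[Any, np.ndarray, np.ndarray], np.ndarray]  # r : P x A x B -> A

    def compose(f: Learner, g: Learner) -> Learner:
        """Composite learner A -> C of f : A -> B and g : B -> C.
        Parameters multiply; g's request supplies the target that f trains against."""
        def implement(pq, a):
            p, q = pq
            return g.implement(q, f.implement(p, a))
        def update(pq, a, c):
            p, q = pq
            b = f.implement(p, a)
            return (f.update(p, a, g.request(q, b, c)), g.update(q, b, c))
        def request(pq, a, c):
            p, q = pq
            b = f.implement(p, a)
            return f.request(p, a, g.request(q, b, c))
        return Learner((f.p, g.p), implement, update, request)

    def affine_learner(w: np.ndarray, bias: np.ndarray, eps: float = 0.01) -> Learner:
        """Gradient descent with step size eps for f(p, a) = W a + c,
        using the quadratic error E(p, a, b) = 1/2 * ||f(p, a) - b||^2."""
        def implement(p, a):
            W, c = p
            return W @ a + c
        def update(p, a, target):
            W, c = p
            err = implement(p, a) - target                       # dE/d(output)
            return (W - eps * np.outer(err, a), c - eps * err)   # p - eps * grad_p E
        def request(p, a, target):
            W, _ = p
            err = implement(p, a) - target
            return a - W.T @ err                                 # a - grad_a E (quadratic-error case)
        return Learner((w, bias), implement, update, request)

    # One training step of a two-layer composite on a single example (a, c):
    layer1 = affine_learner(np.zeros((3, 2)), np.zeros(3))
    layer2 = affine_learner(np.zeros((1, 3)), np.zeros(1))
    net = compose(layer1, layer2)
    a, c = np.array([1.0, -2.0]), np.array([0.5])
    net.p = net.update(net.p, a, c)  # the outer layer's request acts as the backpropagated error signal

Roughly, the paper's functoriality claim is that running gradient descent on the composite function directly agrees with composing the per-layer learners as above, which is what backpropagation computes.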
Citations

Learning Functors using Gradient Descent
TLDR: This paper builds a category-theoretic formalism around a neural network system called CycleGAN, a general approach to unpaired image-to-image translation that has received attention in recent years, and shows that enforcing cycle-consistencies amounts to enforcing composition invariants in this category.
Learners' Languages
In "Backprop as functor", the authors show that the fundamental elements of deep learning, gradient descent and backpropagation, can be conceptualized as a strong monoidal functor Para(Euc) → Learn…
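Spelled out, as a paraphrase from memory of the paper's construction (so the notation here may differ from the original): for a fixed step size $\varepsilon > 0$ and a differentiable error function $e$ satisfying the paper's invertibility condition on its partial derivative, this functor sends a parametrised map $I : P \times A \rightarrow B$ to the learner $(P, I, U_I, r_I)$ with
$$E_I(p,a,b) = \sum_j e\big(I_j(p,a),\, b_j\big), \qquad U_I(p,a,b) = p - \varepsilon\, \nabla_p E_I(p,a,b),$$
while the request map $r_I(p,a,b)$ is recovered from $\nabla_a E_I(p,a,b)$ using that invertibility; for the quadratic error $e(x,y)=\tfrac{1}{2}(x-y)^2$ it reduces to $r_I(p,a,b) = a - \nabla_a E_I(p,a,b)$.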
Compositional Deep Learning
TLDR: This thesis builds a category-theoretic formalism around a class of neural networks exemplified by CycleGAN, and uses the framework to conceive a novel neural network architecture whose goal is to learn the task of object insertion and object deletion in images with unpaired data.
Deep Learning, Grammar Transfer, and Transportation Theory
TLDR: Grammar transfer is used to demonstrate a paradigm that connects artificial intelligence and human intelligence, and it is shown that this learning model can learn a grammar in general but fails to follow the optimal way of learning.
Dioptics: a Common Generalization of Open Games and Gradient-Based Learners
Compositional semantics have been shown for machine-learning algorithms [FST18] and open games [Hed18]; at SYCO 1, remarks were made noting the high degree of overlap in character and analogy between…
Lenses and Learners
TLDR: This paper shows both that there is a faithful, identity-on-objects symmetric monoidal functor embedding a category of asymmetric lenses into the category of learners, and furthermore that there is such a functor embedding the category of learners into a category of symmetric lenses.
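The first of these embeddings can be made concrete: as I recall the construction, an asymmetric lens, consisting of a get map $A \rightarrow B$ and a put map $A \times B \rightarrow A$, becomes a learner whose parameter space is trivial, so nothing is ever updated. Below is a small Python sketch of this, my own illustration (the name lens_to_learner and the example lens are assumptions); it reuses the Learner class from the sketch after the abstract above.

    def lens_to_learner(get, put) -> Learner:
        """An asymmetric lens (get, put) viewed as a learner with a one-point parameter space."""
        return Learner(
            p=None,                              # trivial parameter space
            implement=lambda _p, a: get(a),      # I(*, a)    = get(a)
            update=lambda _p, a, b: None,        # U(*, a, b) = *
            request=lambda _p, a, b: put(a, b),  # r(*, a, b) = put(a, b)
        )

    # Example: the lens that views the first component of a pair.
    fst = lens_to_learner(lambda a: a[0], lambda a, b: (b, a[1]))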
Characterizing the invariances of learning algorithms using category theory
K. Harris · Computer Science, Mathematics · ArXiv · 2019
TLDR: The framework for an invariant learning algorithm is a natural transformation between two functors, representing training datasets and learned functions respectively, from the product of the categories describing the data's invariances to the category of sets.
Reverse Derivative Ascent: A Categorical Approach to Learning Boolean Circuits
TLDR: Reverse Derivative Ascent is introduced: a categorical analogue of gradient-based methods for machine learning that allows the parameters of Boolean circuits to be learned directly, in contrast to existing binarised neural network approaches.
Neural Nets via Forward State Transformation and Backward Loss Transformation
This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved, both forward and backward, in order to develop a semantical/logical perspective…
Differentiable Causal Computations via Delayed Trace
TLDR: The delayed trace operation provides a feedback mechanism in $\mathrm{St}(\mathbb{C})$ with an implicit guardedness guarantee, and a differential operator is constructed using an abstract version of backpropagation through time, a technique from machine learning based on the unrolling of functions.

References

(Showing 10 of 21 references)
Lenses and Learners
TLDR: This paper shows both that there is a faithful, identity-on-objects symmetric monoidal functor embedding a category of asymmetric lenses into the category of learners, and furthermore that there is such a functor embedding the category of learners into a category of symmetric lenses.
Relational lenses: a language for updatable views
TLDR: The approach is to define a bi-directional query language, in which every expression can be read both (from left to right) as a view definition and (from right to left) as an update policy.
A compositional framework for Markov processes
We define the concept of an “open” Markov process, or more precisely, continuous-time Markov chain, which is one where probability can flow in or out of certain states called “inputs” and “outputs.”…
The algebra of open and interconnected systems
B. Fong · Mathematics, Computer Science · 2016
TLDR: This thesis develops the theory of hypergraph categories and introduces the tools of decorated cospans and corelations, the latter a more powerful version that permits construction of all hypergraph categories and hypergraph functors.
Geometric Deep Learning: Going beyond Euclidean data
TLDR: Deep neural networks are used to solve a broad range of problems in computer vision, natural-language processing, and audio analysis, where the invariances of the underlying structures are built into the networks used to model them.
Algebras of Open Dynamical Systems on the Operad of Wiring Diagrams
TLDR: This paper uses the language of operads to study the algebraic nature of assembling complex dynamical systems from an interconnection of simpler ones, and defines two W-algebras, G and L, which associate semantic content to the structures in W.
From open learners to open games
TLDR: It is proved that there is a faithful symmetric monoidal functor from the category of open learners to the category of open games, which means that any supervised neural network can be seen as an open game in a canonical way.
Understanding deep image representations by inverting them
Image representations, from SIFT and Bag of Visual Words to Convolutional Neural Networks (CNNs), are a crucial component of almost any image understanding system. Nevertheless, our understanding of…
Categories for the Working Mathematician
I. Categories, Functors and Natural Transformations: 1. Axioms for Categories; 2. Categories; 3. Functors; 4. Natural Transformations; 5. Monics, Epis, and Zeros; 6. Foundations; 7. Large…
Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning
TLDR: This entirely diagrammatic presentation of quantum theory represents the culmination of ten years of research, uniting classical techniques in linear algebra and Hilbert spaces with cutting-edge developments in quantum computation and foundations.