Compositional Deep Learning
@article{Gavranovic2019CompositionalDL,
  title   = {Compositional Deep Learning},
  author  = {Bruno Gavranovic},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1907.08292}
}
Neural networks have become an increasingly popular tool for solving many real-world problems. They are a general framework for differentiable optimization which includes many other machine learning approaches as special cases. In this thesis we build a category-theoretic formalism around a class of neural networks exemplified by CycleGAN. CycleGAN is a collection of neural networks, closed under composition, whose inductive bias is increased by enforcing composition invariants, i.e. cycle-consistencies.
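To make the composition invariant concrete, here is a minimal sketch of a CycleGAN-style cycle-consistency penalty; the generators g and f and the L1 metric are illustrative stand-ins, not the thesis's construction:

```python
import numpy as np

def cycle_consistency_loss(g, f, x, y):
    """L1 cycle-consistency penalty for generators g : X -> Y and
    f : Y -> X. It rewards f(g(x)) ~ x and g(f(y)) ~ y, i.e. that
    the two translations compose to (approximately) the identity."""
    forward_cycle = np.abs(f(g(x)) - x).mean()   # X -> Y -> X
    backward_cycle = np.abs(g(f(y)) - y).mean()  # Y -> X -> Y
    return forward_cycle + backward_cycle

# Toy check: mutually inverse linear maps incur zero penalty.
g = lambda x: 2.0 * x
f = lambda y: 0.5 * y
x, y = np.random.randn(4, 3), np.random.randn(4, 3)
print(cycle_consistency_loss(g, f, x, y))  # 0.0
```

Minimizing such a penalty alongside the usual adversarial losses is what enforces the composition invariant the abstract refers to.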
Citations
Categorical Foundations of Gradient-Based Learning
- Computer Science · ESOP
- 2022
A categorical foundation of gradient-based machine learning algorithms in terms of lenses, parametrised maps, and reverse derivative categories is proposed, which encompasses a variety of gradient descent algorithms such as ADAM, AdaGrad, and Nesterov momentum.
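As an illustration of the lens picture in that summary (a sketch under assumed names, not the paper's notation): a layer is a forward map paired with a reverse-derivative map, and such pairs compose the way backprop does:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Lens:
    """A map with a reverse derivative: fwd sends an input to an
    output; bwd sends an input and an output-change back to an
    input-change."""
    fwd: Callable  # A -> B
    bwd: Callable  # (A, dB) -> dA

def compose(l1: Lens, l2: Lens) -> Lens:
    """Lens composition: forward passes chain left to right,
    backward passes chain right to left (the shape of backprop)."""
    def fwd(a):
        return l2.fwd(l1.fwd(a))
    def bwd(a, dc):
        b = l1.fwd(a)         # recompute the intermediate value
        db = l2.bwd(b, dc)    # pull the change back through l2 ...
        return l1.bwd(a, db)  # ... and then back through l1
    return Lens(fwd, bwd)

# Toy check: squaring then tripling, with their reverse derivatives.
square = Lens(lambda a: a * a, lambda a, db: 2 * a * db)
triple = Lens(lambda b: 3 * b, lambda b, dc: 3 * dc)
pipe = compose(square, triple)
print(pipe.fwd(2.0))       # 12.0
print(pipe.bwd(2.0, 1.0))  # 12.0 = d(3a^2)/da at a = 2
```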
Categorical Stochastic Processes and Likelihood
- Mathematics, Computer Science · Compositionality
- 2021
A category-theoretic perspective on the relationship between probabilistic modeling and gradient-based optimization is taken, and a way to compose the likelihood functions of these models is defined.
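For a flavor of what composing stochastic models means concretely (a hypothetical finite-state sketch, not the paper's construction), conditional distributions compose by summing over the intermediate variable, which in matrix form is a product:

```python
import numpy as np

# Stochastic maps as column-stochastic matrices, K[b, a] = p(b | a).
# Composition marginalizes the middle variable:
#   p(c | a) = sum_b p(c | b) p(b | a), i.e. a matrix product.
K1 = np.array([[0.9, 0.2],   # p(b | a)
               [0.1, 0.8]])
K2 = np.array([[0.7, 0.4],   # p(c | b)
               [0.3, 0.6]])
K = K2 @ K1                  # p(c | a)
print(K)
print(K.sum(axis=0))         # columns still sum to 1
```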
Category Theory in Machine Learning
- Computer Science · ArXiv
- 2021
This work aims to document the motivations, goals and common themes across these applications of category theory in machine learning, touching on gradient-based learning, probability, and equivariant learning.
Compositional Game Theory, Compositionally (preliminary version)
- Mathematics
- 2020
We present a new compositional approach to compositional game theory (CGT) based upon Arrows, a concept originally from functional programming, closely related to Tambara modules, and operators to…
Categorical composable cryptography
- Computer Science, Mathematics · FoSSaCS
- 2022
It is shown that protocols secure against abstract attacks form a symmetric monoidal category, giving an abstract model of composable security definitions in cryptography that can incorporate computational security, set-up assumptions, and various attack models in a modular, flexible fashion.
References
Showing 1-10 of 37 references
Deep learning generalizes because the parameter-function map is biased towards simple functions
- Computer Science · ICLR
- 2019
This paper argues that the parameter-function map of many DNNs should be exponentially biased towards simple functions, and provides clear evidence for this strong simplicity bias in a model DNN for Boolean functions, as well as in much larger fully connected and convolutional networks applied to CIFAR10 and MNIST.
Backprop as Functor: A compositional perspective on supervised learning
- Computer Science · 2019 34th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS)
- 2019
A key contribution is the notion of a request function, which provides a structural perspective on backpropagation, yields a broad generalisation of neural networks, and links them with structures from bidirectional programming and open games.
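A sketch of how a request function lets learners compose (the field names are mine; the paper works abstractly, not in code):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Learner:
    """A supervised learner: parameters p, an implementation
    implement(p, a) -> b, a parameter update update(p, a, b) -> p,
    and a request function request(p, a, b) -> a that sends a
    corrected input upstream."""
    p: object
    implement: Callable
    update: Callable
    request: Callable

def compose(l1: Learner, l2: Learner) -> Learner:
    """Compose learners A -> B and B -> C: l2's request function
    manufactures the training signal for l1, so the composite
    trains end to end with no global view of the pipeline."""
    def implement(p, a):
        p1, p2 = p
        return l2.implement(p2, l1.implement(p1, a))
    def update(p, a, c):
        p1, p2 = p
        b = l1.implement(p1, a)
        b_req = l2.request(p2, b, c)  # what l2 wished b had been
        return (l1.update(p1, a, b_req), l2.update(p2, b, c))
    def request(p, a, c):
        p1, p2 = p
        b = l1.implement(p1, a)
        return l1.request(p1, a, l2.request(p2, b, c))
    return Learner((l1.p, l2.p), implement, update, request)
```

Under gradient descent, the request function returns the input nudged along its gradient, which is backpropagation read structurally.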
A Style-Based Generator Architecture for Generative Adversarial Networks
- Computer Science · 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2019
An alternative generator architecture for generative adversarial networks is proposed, borrowing from style transfer literature, that improves the state-of-the-art in terms of traditional distribution quality metrics, leads to demonstrably better interpolation properties, and also better disentangles the latent factors of variation.
Characterizing the invariances of learning algorithms using category theory
- Mathematics, Computer Science · ArXiv
- 2019
An invariant learning algorithm is framed as a natural transformation between two set-valued functors, one representing training datasets and the other learned functions.
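Spelled out in notation of my own choosing (not the paper's), the naturality condition says that training commutes with relabeling:

```latex
% For a learning algorithm L and any relabeling f : Y -> Y',
% training on relabeled data equals relabeling the learned map:
\[
  L\bigl(\{(x_i, f(y_i))\}_i\bigr) \;=\; f \circ L\bigl(\{(x_i, y_i)\}_i\bigr)
\]
```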
Learning to learn by gradient descent by gradient descent
- Computer Science · NIPS
- 2016
This paper shows how the design of an optimization algorithm can be cast as a learning problem, allowing the algorithm to learn to exploit structure in the problems of interest in an automatic way.
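A toy rendering of the idea (the update rule and its parameters phi are invented for illustration; the paper uses an LSTM):

```python
import numpy as np

def learned_step(grad, state, phi):
    """One step of a 'learned' optimizer: the update is produced by
    a small parametrised map with recurrent state, rather than by a
    hand-designed rule like SGD. In the paper, phi itself is trained
    by gradient descent on the optimizee's loss; here it is fixed."""
    state = 0.9 * state + 0.1 * grad        # toy recurrent memory
    update = phi @ np.array([grad, state])  # learned combination
    return update, state

# Toy usage: minimize f(theta) = theta^2 with the learned rule.
phi = np.array([-0.4, -0.2])  # would normally be meta-learned
theta, state = 3.0, 0.0
for _ in range(50):
    grad = 2 * theta          # df/dtheta
    update, state = learned_step(grad, state, phi)
    theta += update
print(theta)  # close to 0
```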
Improved Training of Wasserstein GANs
- Computer Science · NIPS
- 2017
This work proposes an alternative to clipping weights: penalize the norm of the gradient of the critic with respect to its input, which performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
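A sketch of that penalty in PyTorch (the critic, batch shapes, and coefficient here are placeholders):

```python
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    """Penalize how far the critic's gradient norm (w.r.t. its
    input, at random points between real and fake samples) is
    from 1, instead of clipping weights."""
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,  # the penalty is itself backpropagated
    )[0]
    return lam * ((grads.norm(2, dim=1) - 1) ** 2).mean()

# Toy usage with a linear critic on flat 8-dimensional samples.
critic = torch.nn.Linear(8, 1)
real, fake = torch.randn(16, 8), torch.randn(16, 8)
print(gradient_penalty(critic, real, fake))
```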
The simple essence of automatic differentiation (Differentiable functional programming made easy)
- Computer Science · ArXiv
- 2018
A simple, generalized AD algorithm is calculated from a simple, natural specification; it is inherently parallel-friendly, correct by construction, and usable directly from an existing programming language with no need for new data types or programming style, thanks to the use of an AD-agnostic compiler plugin.
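The compositional core that the paper builds on fits in a few lines (a dual-number sketch in Python rather than the paper's Haskell, and only the simplest instance of its construction):

```python
from dataclasses import dataclass

@dataclass
class Dual:
    """A value paired with its derivative. Arithmetic is defined so
    that derivatives of composite expressions fall out of the chain
    rule, with no graph or tape."""
    val: float
    der: float

    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)

# d/dx of f(x) = x*x + x at x = 3: seed der = 1 and evaluate f.
x = Dual(3.0, 1.0)
y = x * x + x
print(y.val, y.der)  # 12.0 7.0
```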
A Compositional Framework for Passive Linear Networks
- Mathematics
- 2015
Passive linear networks are used in a wide variety of engineering applications, but the best studied are electrical circuits made of resistors, inductors and capacitors. We describe a category where…
Causal Theories: A Categorical Perspective on Bayesian Networks
- Mathematics
- 2012
In this dissertation we develop a new formal graphical framework for causal reasoning. Starting with a review of monoidal categories and their associated graphical languages, we then revisit…
Mathematical Foundations for a Compositional Distributional Model of Meaning
- Mathematics · ArXiv
- 2010
A mathematical framework is proposed that unifies the distributional theory of meaning in terms of vector space models with a compositional theory of grammatical types, lifting the type reductions of Pregroups to morphisms in a category, a procedure that transforms the meanings of constituents into a meaning of the (well-typed) whole.