Corpus ID: 240353913

Equinox: neural networks in JAX via callable PyTrees and filtered transformations

@article{Kidger2021EquinoxNN,
  title={Equinox: neural networks in JAX via callable PyTrees and filtered transformations},
  author={Patrick Kidger and Cristian Garcia},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.00254}
}
JAX and PyTorch are two popular Python autodifferentiation frameworks. JAX is based around pure functions and functional programming. PyTorch has popularised the use of an object-oriented (OO) class-based syntax for defining parameterised functions, such as neural networks. That this seems like a fundamental difference means current libraries for building parameterised functions in JAX have either rejected the OO approach entirely (Stax) or have introduced OO-to-functional transformations…
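The truncated abstract names the paper's two core ideas: models are "callable PyTrees", and JAX's transformations are wrapped in "filtered" variants that decide which PyTree leaves participate. As a minimal sketch of how this looks in practice, using Equinox's public eqx.Module and eqx.filter_grad (the Linear class and the surrounding snippet are illustrative, not code taken from the paper):

# A minimal sketch of the two ideas named in the title, using Equinox's
# public API (eqx.Module, eqx.filter_grad). Illustrative example only.
import jax
import jax.numpy as jnp
import jax.random as jr
import equinox as eqx

class Linear(eqx.Module):
    # An eqx.Module is a PyTree whose leaves are its fields, and it is
    # callable: the class itself is the parameterised function.
    weight: jax.Array
    bias: jax.Array

    def __init__(self, in_size, out_size, key):
        self.weight = jr.normal(key, (out_size, in_size))
        self.bias = jnp.zeros(out_size)

    def __call__(self, x):
        return self.weight @ x + self.bias

# A filtered transformation: differentiate with respect to every inexact
# (floating-point) array leaf of the first argument, leaving any other
# leaves (activation functions, flags, ...) untouched.
@eqx.filter_grad
def loss(model, x, y):
    return jnp.mean((model(x) - y) ** 2)

model = Linear(2, 3, key=jr.PRNGKey(0))
grads = loss(model, jnp.ones(2), jnp.zeros(3))
# `grads` has the same PyTree structure as `model`: a Linear whose
# fields hold gradient arrays.

Because the gradient object shares the model's PyTree structure, standard JAX utilities such as jax.tree_util.tree_map can apply parameter updates directly, with no extra abstraction between the model and JAX itself.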
1 Citation

Optical design, analysis, and calibration using ∂Lux

TLDR
This manuscript explores some of the many ways to harness the potential of these codes, focusing on the Toliman space telescope mission as an application example.

References

Showing 1-10 of 15 references

PyTorch: An Imperative Style, High-Performance Deep Learning Library

TLDR
This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.

Flux: Elegant machine learning with Julia

  • Mike Innes
  • Computer Science
    J. Open Source Softw.
  • 2018
TLDR
Flux is a library for machine learning (ML), written in the numerical computing language Julia; it applies automatic differentiation (AD) to seamlessly calculate derivatives and train models.

Fashionable Modelling with Flux

TLDR
A framework named Flux is presented that shows how further refinement of the core ideas of machine learning, built upon the foundation of the Julia programming language, can yield an environment that is simple, easily modifiable, and performant.

Julia: A Fresh Approach to Numerical Computing

TLDR
The Julia programming language and its design are introduced: a dance between specialization and abstraction that recognizes what remains the same after computation and what is best left untouched.

Swift for TensorFlow: A portable, flexible platform for deep learning

TLDR
The deep learning platform Swift for TensorFlow combines a language-integrated automatic differentiation system with multiple Tensor implementations, within a modern ahead-of-time compiled language oriented around mutable value semantics.

Decomposing reverse-mode automatic differentiation

We decompose reverse-mode automatic differentiation into (forward-mode) linearization followed by transposition. Doing so isolates the essential difference between forward- and reverse-mode AD… (A minimal JAX sketch of this decomposition is given after the reference list.)

functorch: JAX-like composable function transforms for PyTorch

  • 2021

Haiku: Sonnet for JAX

  • Version 0.0.3
  • 2020

JAX: composable transformations of Python+NumPy programs. Version 0.2.5

  • 2018

torchtyping. Accessed 2021

  • URL: https://github.com/patrick-kidger/torchtyping
  • 2021
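The "Decomposing reverse-mode automatic differentiation" entry above is cut off mid-sentence, but the decomposition it describes is visible directly in JAX's own API: jax.linearize provides forward-mode linearization, and jax.linear_transpose transposes the resulting linear map to recover reverse mode. A minimal sketch (the function f is an arbitrary illustrative choice, not from the paper):

import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x  # an arbitrary differentiable function

x = jnp.arange(1.0, 4.0)

# Forward mode: linearize f at x, producing the primal output and the
# Jacobian-vector product (a linear map on tangents).
y, f_jvp = jax.linearize(f, x)

# Transposing that linear map yields the vector-Jacobian product,
# i.e. reverse-mode AD.
f_vjp = jax.linear_transpose(f_jvp, x)
(cotangent,) = f_vjp(jnp.ones_like(y))

# Agrees with JAX's built-in reverse mode.
_, vjp_fn = jax.vjp(f, x)
(expected,) = vjp_fn(jnp.ones_like(y))
assert jnp.allclose(cotangent, expected)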