Flux: Elegant machine learning with Julia

@article{Innes2018FluxEM,
  title={Flux: Elegant machine learning with Julia},
  author={Mike Innes},
  journal={J. Open Source Softw.},
  year={2018},
  volume={3},
  pages={602}
}
  • Mike Innes
  • Published 3 May 2018
  • Computer Science
  • J. Open Source Softw.
Flux is a library for machine learning (ML), written in the numerical computing language Julia (Bezanson et al. 2017). The package allows models to be written using Julia’s simple mathematical syntax, and applies automatic differentiation (AD) to seamlessly calculate derivatives and train the model. Meanwhile, it makes heavy use of Julia’s language and compiler features to carry out code analysis and perform optimisations. For example, Julia’s GPU compilation support (Besard, Foket, and De Sutter…
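
To make that style concrete, here is a minimal sketch of defining and training a model in Flux. It is not code from the paper; the toy data, layer sizes, and use of Flux's current explicit optimiser API (Flux.setup / Flux.update!) are assumptions for illustration.

  using Flux

  # Toy data, assumed for illustration: 4 features, 2 classes, 100 samples.
  x = rand(Float32, 4, 100)
  y = Flux.onehotbatch(rand(1:2, 100), 1:2)

  # A model written with Julia's plain function-composition syntax.
  model = Chain(Dense(4 => 8, relu), Dense(8 => 2))

  # The loss is an ordinary Julia function of the model and the data.
  loss(m, x, y) = Flux.logitcrossentropy(m(x), y)

  # AD computes the gradient of the loss w.r.t. every model parameter.
  grads = Flux.gradient(m -> loss(m, x, y), model)

  # One step of gradient descent.
  opt_state = Flux.setup(Descent(0.1), model)
  Flux.update!(opt_state, model, grads[1])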

Interoperating Deep Learning models with ONNX.jl

ONNX.jl is an Open Neural Network Exchange backend for the Flux.jl deep learning framework, allowing trained models to be imported from, and exported to, other deep learning frameworks via the ONNX interchange format.
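
A hypothetical sketch of such an exchange follows. The entry point ONNX.load, the file name, and the input shape are all assumptions made here for illustration, not confirmed API; consult the documentation of the installed package version for the exact interface.

  using ONNX

  # Assumed: an ONNX file exported from another framework, plus a
  # dummy input giving the expected shape (both hypothetical).
  path  = "model.onnx"
  dummy = rand(Float32, 224, 224, 3, 1)

  # Load the ONNX graph into a Julia representation (function name assumed).
  model = ONNX.load(path, dummy)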

BetaML: The Beta Machine Learning Toolkit, a self-contained repository of Machine Learning algorithms in Julia

A series of machine learning algorithms has been implemented and bundled together with several “utility” functions in a single package for the Julia programming language. Currently, algorithms are…

TensorFlow.jl: An Idiomatic Julia Front End for TensorFlow

TensorFlow.jl is a Julia client library for the TensorFlow deep-learning framework. It allows users to define TensorFlow graphs using Julia syntax; these graphs are interchangeable with those produced by Google’s first-party Python TensorFlow client and can be used to perform training or inference on machine-learning models.
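
For flavour, a sketch in the style of the TensorFlow.jl README (the particular constants are assumptions): graph nodes are built with ordinary Julia expressions and executed through a session.

  using TensorFlow

  sess = TensorFlow.Session()

  # Graph nodes written as ordinary Julia expressions.
  x = TensorFlow.constant(Float64[1.0, 2.0])
  z = TensorFlow.placeholder(Float64)
  w = exp(x + z)

  # Running the graph fills in the placeholder and returns the result.
  result = run(sess, w, Dict(z => Float64[3.0, 4.0]))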

MLJ: A Julia package for composable Machine Learning

In this design overview, the chief novelties of the MLJ framework are detailed, together with the clear benefits of Julia over the dominant multi-language alternatives.
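
As a small illustration of that composability (a sketch; the toy dataset and the choice of DecisionTreeClassifier are assumptions, and the MLJDecisionTreeInterface glue package must be installed), MLJ binds any model and data into a "machine" that is then fit and queried uniformly:

  using MLJ

  # Load a toy dataset and a model type from a registered package.
  X, y = @load_iris
  Tree = @load DecisionTreeClassifier pkg=DecisionTree

  # Bind the model and data into a machine, then fit and predict.
  mach = machine(Tree(), X, y)
  fit!(mach)
  yhat = predict(mach, X)   # probabilistic predictions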

TensorFlow Eager: A Multi-Stage, Python-Embedded DSL for Machine Learning

TensorFlow Eager provides an imperative front-end to TensorFlow that executes operations immediately, together with a JIT tracer that translates Python functions composed of TensorFlow operations into executable dataflow graphs.

The JuliaConnectoR: a functionally oriented interface for integrating Julia in R

The R package JuliaConnectoR, available from the CRAN repository and GitHub, is developed to make advanced deep learning tools in Julia available from R; it enables a clean programming style by avoiding state in Julia that is not visible in the R workspace.

BackPACK: Packing more into backprop

BackPACK is introduced: an efficient framework built on top of PyTorch that extends the backpropagation algorithm to extract additional information from first- and second-order derivatives, addressing the fact that automatic differentiation frameworks otherwise do not support quantities such as the variance of the mini-batch gradients.

Flashlight: Enabling Innovation in Tools for Machine Learning

Flashlight is introduced, an open-source library built to spur innovation in machine learning tools and systems by prioritizing open, modular, customizable internals and state-of-the-art, research-ready models and training setups across a variety of domains.

Algorithmic Differentiation

Zygote is designed to address the needs of both the machine learning and scientific computing communities, who have historically been siloed by their very different tools, and to enable differentiable programming (∂P), in which arbitrary numerical programs can make use of gradient-based optimisation.
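
With Zygote, gradients of arbitrary Julia functions are obtained directly; a minimal example (the function itself is an arbitrary choice, not from the paper):

  using Zygote

  f(x) = 3x^2 + 2x + 1

  # gradient returns a tuple with one entry per argument;
  # f'(x) = 6x + 2, so f'(5) = 32.
  Zygote.gradient(f, 5.0)   # -> (32.0,)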

Equinox: neural networks in JAX via callable PyTrees and filtered transformations

‘Equinox’ is introduced, a small neural network library showing how a PyTorch-like class-based approach may be admitted without sacrificing JAX-like functional programming.
...

References

Effective Extensible Programming: Unleashing Julia on GPUs

This work proposes compiler infrastructure to efficiently add support for new hardware or environments to an existing programming language, significantly lowering the cost of implementing and maintaining the new compiler and facilitating reuse of existing application code.
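
For a flavour of what this infrastructure enables, here is an illustrative sketch using the present-day CUDA.jl package (the successor to the CUDAnative.jl work described; the kernel is an assumption for illustration, not code from the paper):

  using CUDA

  # A plain Julia function compiled to a GPU kernel.
  function vadd!(c, a, b)
      i = threadIdx().x + (blockIdx().x - 1) * blockDim().x
      if i <= length(c)
          @inbounds c[i] = a[i] + b[i]
      end
      return nothing
  end

  a = CUDA.rand(256)
  b = CUDA.rand(256)
  c = similar(a)

  # @cuda compiles vadd! for the device and launches it.
  @cuda threads=256 vadd!(c, a, b)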

Julia: A Fresh Approach to Numerical Computing

The Julia programming language and its design are introduced---a dance between specialization and abstraction that recognizes what remains the same after computation and what is best left untouched, having been built by the experts.

“Minibatch.jl.” 2018. https://github.com/jekbradbury/Minibatch.jl