Corpus ID: 197431419

A Differentiable Programming System to Bridge Machine Learning and Scientific Computing

@article{Innes2019ADP,
  title={A Differentiable Programming System to Bridge Machine Learning and Scientific Computing},
  author={Mike Innes and Alan Edelman and Keno Fischer and Chris Rackauckas and Elliot Saba and Viral B. Shah and Will Tebbutt},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.07587}
}
Scientific computing is increasingly incorporating the advancements in machine learning and the ability to work with large amounts of data. […] We implement this system in the Julia programming language. Our system supports almost all language constructs (control flow, recursion, mutation, etc.) and compiles high-performance code without requiring any user intervention or refactoring to stage computations. This enables an expressive programming model for deep learning, but more importantly, it…
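
As a sketch of what such a system enables, the snippet below differentiates an ordinary Julia function containing a loop, with no staging, tracing, or refactoring. It assumes the authors' open-source Zygote.jl package (the implementation associated with this paper), whose gradient function is the entry point.

    # A minimal sketch, assuming the Zygote.jl package, of differentiating
    # ordinary Julia code that contains control flow.
    using Zygote

    # A plain Julia function with a loop; no graph construction is required.
    function mypow(x, n)
        r = one(x)
        for _ in 1:n
            r *= x
        end
        return r
    end

    # `gradient` returns a tuple with one derivative per argument of the closure;
    # d/dx x^3 = 3x^2, so the result at x = 2.0 is 12.0.
    dx, = gradient(x -> mypow(x, 3), 2.0)
    println(dx)   # 12.0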

Citations

Universal Differential Equations for Scientific Machine Learning

The UDE model augments scientific models with machine-learnable structures for scientifically based learning and shows how UDEs can be utilized to discover previously unknown governing equations, accurately extrapolate beyond the original data, and accelerate model simulation, all in a time- and data-efficient manner.
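
As a rough illustration of the idea (not the paper's own code), a universal differential equation pairs a known mechanistic term with a trainable neural-network correction inside the same ODE right-hand side. The sketch below assumes the Flux and DifferentialEquations Julia packages and a made-up two-state system; in practice the flattened parameters p would then be fit by gradient-based optimization through the solver.

    # A minimal sketch (hypothetical dynamics, not from the paper): known linear
    # decay plus a learnable neural-network term inside an ODE.
    using DifferentialEquations, Flux

    nn    = Chain(Dense(2 => 16, tanh), Dense(16 => 2))  # learnable correction
    p, re = Flux.destructure(nn)                          # flatten parameters

    function ude!(du, u, p, t)
        du .= -0.5 .* u .+ re(p)(u)   # mechanistic part + learned part
    end

    prob = ODEProblem(ude!, [1.0, 0.0], (0.0, 1.0), p)
    sol  = solve(prob, Tsit5(), saveat = 0.1)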

Differentiable Programming in High-Energy Physics

This Snowmass LOI outlines the potential advantages and challenges of adopting a differentiable programming paradigm in high-energy physics.

Using Differentiable Programming for Flexible Statistical Modeling

This work develops a regression model, inspired by delay differential equations, that can bridge temporal gaps of observations in the central German registry of COVID-19 intensive care cases for predicting future demand and illustrates how differentiable programming can enable simple gradient-based optimization of the model by automatic differentiation.

DiffTaichi: Differentiable Programming for Physical Simulation

We present DiffTaichi, a new differentiable programming language tailored for building high-performance differentiable physical simulators. Based on an imperative programming language, DiffTaichi…

Instead of Rewriting Foreign Code for Machine Learning, Automatically Synthesize Fast Gradients

Enzyme synthesizes gradients for programs written in any language whose compiler targets LLVM IR including C, C++, Fortran, Julia, Rust, Swift, MLIR, etc., thereby providing native AD capabilities in these languages.

Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs

Predictive coding converges asymptotically (and in practice, rapidly) to exact backprop gradients on arbitrary computation graphs using only local learning rules, raising the potential that standard machine learning algorithms could in principle be directly implemented in neural circuitry.

JAX, M.D.: End-to-End Differentiable, Hardware Accelerated, Molecular Dynamics in Pure Python

The architecture of JAX MD, an end-to-end differentiable MD package written entirely in Python, is explored; it can be just-in-time compiled to CPU, GPU, or TPU, and lets researchers easily incorporate machine learning models into their workflows.

A guide to machine learning for biologists

This Review provides a gentle introduction to a few key machine learning techniques, including the most recently developed and widely used techniques involving deep neural networks.

Research Project Proposal: Structured Learning

The aim is to insert algorithmic modules with predefined behavior into the architecture while maintaining the end-to-end training procedure; this emerging field is called Structured Learning: algorithms provide structure, and neural networks provide the flexibility to learn and adapt from data.
...

References

SHOWING 1-10 OF 50 REFERENCES

TensorFlow: A system for large-scale machine learning

The TensorFlow dataflow model is described and the compelling performance that TensorFlow achieves for several real-world applications is demonstrated.

Automatic differentiation in machine learning: a survey

By precisely defining the main differentiation techniques and their interrelationships, this work aims to bring clarity to the usage of the terms "autodiff", "automatic differentiation", and "symbolic differentiation" as these are encountered more and more in machine learning settings.

Demystifying differentiable programming: shift/reset the penultimate backpropagator

This paper uncovers a tight connection between reverse-mode AD and delimited continuations, which permits implementing reverse-mode AD purely via operator overloading and without managing any auxiliary data structures, and shows how this formulation of AD can be fruitfully combined with multi-stage programming (staging), leading to an efficient implementation.
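
For readers unfamiliar with the operator-overloading route mentioned here, the toy sketch below shows the mechanism in Julia. It is not the paper's formulation: the paper uses delimited continuations and avoids auxiliary data structures, whereas this sketch records an explicit tape purely to keep the example short.

    # A minimal, self-contained sketch of reverse-mode AD by operator
    # overloading: each arithmetic operation records a pullback closure that
    # later propagates adjoints to its inputs.
    mutable struct Tracked
        value::Float64
        grad::Float64
        pullback::Function
    end

    const TAPE = Tracked[]   # every intermediate value, in evaluation order

    function tracked(value, pullback = () -> nothing)
        t = Tracked(float(value), 0.0, pullback)
        push!(TAPE, t)
        return t
    end

    import Base: +, *

    function +(a::Tracked, b::Tracked)
        out = tracked(a.value + b.value)
        out.pullback = () -> (a.grad += out.grad; b.grad += out.grad)
        return out
    end

    function *(a::Tracked, b::Tracked)
        out = tracked(a.value * b.value)
        out.pullback = () -> (a.grad += b.value * out.grad; b.grad += a.value * out.grad)
        return out
    end

    # Seed the output adjoint and run the recorded pullbacks in reverse.
    function backward!(out::Tracked)
        out.grad = 1.0
        for t in reverse(TAPE)
            t.pullback()
        end
    end

    # Example: f(x, y) = x * y + x, so df/dx = y + 1 and df/dy = x.
    x, y = tracked(2.0), tracked(3.0)
    z = x * y + x
    backward!(z)
    println((x.grad, y.grad))   # (4.0, 2.0)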

Fashionable Modelling with Flux

A framework named Flux is presented that shows how further refinement of the core ideas of machine learning, built upon the foundation of the Julia programming language, can yield an environment that is simple, easily modifiable, and performant.
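
As a small example of the style of model definition this encourages (a sketch assuming a recent Flux.jl release with the explicit-gradient training API), a model can be defined, differentiated, and updated in a few lines of ordinary Julia:

    # A minimal sketch, assuming Flux.jl: define a model, take gradients of a
    # loss with respect to it, and apply one optimizer step.
    using Flux

    model = Chain(Dense(4 => 8, relu), Dense(8 => 1))
    x, y  = rand(Float32, 4, 16), rand(Float32, 1, 16)   # toy batch of 16

    loss(m, x, y) = Flux.mse(m(x), y)

    opt_state = Flux.setup(Adam(1e-3), model)
    grads = Flux.gradient(m -> loss(m, x, y), model)[1]
    Flux.update!(opt_state, model, grads)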

A Differentiable Physics Engine for Deep Learning in Robotics

This paper proposes an implementation of a modern physics engine that can differentiate control parameters and runs on both CPU and GPU, and shows how such an engine speeds up the optimization process, even for small problems.

Julia: A Fresh Approach to Numerical Computing

The Julia programming language and its design are introduced: a dance between specialization and abstraction, which recognizes what remains the same after computation and what is best left untouched because it has been built by experts.

End-to-End Differentiable Physics for Learning and Control

This paper demonstrates how to perform backpropagation analytically through a physical simulator defined via a linear complementarity problem, and highlights the system's ability to learn physical parameters from data, efficiently match and simulate observed visual behavior, and readily enable control via gradient-based planning methods.

Neural Ordinary Differential Equations

This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
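
The key ingredient is the continuous adjoint-sensitivity method, which obtains gradients by solving a second ODE backwards in time instead of differentiating through the solver's internal steps. In the notation of that paper, for dynamics dz/dt = f(z(t), t, theta) and a scalar loss L(z(t_1)), the adjoint a(t) and the parameter gradient satisfy:

    a(t) = \frac{\partial L}{\partial z(t)}, \qquad
    \frac{d a(t)}{dt} = -\, a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial z}, \qquad
    \frac{dL}{d\theta} = -\int_{t_1}^{t_0} a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial \theta} \, dt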

Automatic Full Compilation of Julia Programs and ML Models to Cloud TPUs

A method and implementation are described for offloading suitable sections of Julia programs to TPUs via Google's XLA compiler and its API; the system is able to completely fuse the forward pass of a VGG19 model, expressed as a Julia program, into a single TPU executable to be offloaded to the device.

Reverse-mode AD in a functional framework: Lambda the ultimate backpropagator

We show that reverse-mode AD (automatic differentiation), a generalized gradient-calculation operator, can be incorporated as a first-class function in an augmented lambda calculus, and therefore into…