Corpus ID: 239009404

A Brief Introduction to Automatic Differentiation for Machine Learning

@article{Harrison2021ABI,
  title={A Brief Introduction to Automatic Differentiation for Machine Learning},
  author={Davan Harrison},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.06209}
}
Machine learning, and neural network models in particular, have been improving state-of-the-art performance on many artificial intelligence tasks. Neural network models are typically implemented using frameworks that perform gradient-based optimization methods to fit a model to a dataset. These frameworks use a technique for calculating derivatives called automatic differentiation (AD), which removes the burden of performing derivative calculations from the model designer. In this…
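To make the abstract's point concrete, the following is a minimal sketch of reverse-mode AD in PyTorch (one of the frameworks referenced below). The toy linear model and the variable names are illustrative assumptions, not material from the paper.

import torch

# Toy data and parameters for a one-dimensional linear model y = w * x + b
# (names and values are assumptions chosen for illustration).
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])
w = torch.tensor(0.5, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

# The user writes only the forward computation (a mean squared error loss).
loss = ((w * x + b - y) ** 2).mean()

# The framework derives d(loss)/dw and d(loss)/db by reverse-mode AD;
# no derivative expressions are written by hand.
loss.backward()
print(w.grad, b.grad)

Calling loss.backward() populates w.grad and b.grad, which a gradient-based optimizer would then use to update the parameters.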


References

Automatic differentiation in machine learning: a survey
TLDR
By precisely defining the main differentiation techniques and their interrelationships, this work aims to bring clarity to the usage of the terms "autodiff", "automatic differentiation", and "symbolic differentiation" as these are encountered more and more in machine learning settings (a minimal dual-number sketch illustrating this distinction appears after the reference list).
Automatic differentiation in PyTorch
TLDR
An automatic differentiation module of PyTorch is described: a library designed to enable rapid research on machine learning models, focusing on the differentiation of purely imperative programs with an emphasis on extensibility and low overhead.
TensorFlow: A system for large-scale machine learning
TLDR
The TensorFlow dataflow model is described and the compelling performance that TensorFlow achieves for several real-world applications is demonstrated.
Automatic differentiation in ML: Where we are and where we should be going
TLDR
A new graph-based intermediate representation (IR) is introduced which specifically aims to efficiently support fully-general AD for array programming, and naturally supports function calls, higher-order functions and recursion, making ML models easier to implement.
Theano: Deep Learning on GPUs with Python
TLDR
This paper presents Theano, a framework in the Python programming language for defining, optimizing and evaluating expressions involving high-level operations on tensors, and adds automatic symbolic differentiation, GPU support, and faster expression evaluation.
Dataflow Programming Concept, Languages and Applications
TLDR
This survey describes how visual programming languages built on top of DFP can be used for end-user programming and how easy it is to achieve concurrency by applying the paradigm, without any development overhead.
The on-line graphical specification of computer procedures
TLDR
This paper is intended to demonstrate the efforts of the Massachusetts Institute of Technology’s graduate students to improve the quality of their teaching and research in the field of electrical engineering.
Advances in dataflow programming languages
TLDR
How dataflow programming evolved toward a hybrid von Neumann/dataflow formulation and adopted a more coarse-grained approach is discussed.
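As a complement to the survey's distinction between automatic and symbolic differentiation (first reference above), the following is a minimal sketch of forward-mode automatic differentiation via dual numbers, with symbolic differentiation noted in the closing comment. The Dual class and the toy function are illustrative assumptions, not taken from any of the papers listed here.

import math
from dataclasses import dataclass

@dataclass
class Dual:
    # A value together with the derivative it carries (a dual number a + b*eps).
    val: float
    dot: float

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # Product rule applied numerically, one elementary operation at a time.
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

def sin(x):
    # Chain rule for sin applied to the carried derivative.
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# f(x) = x * sin(x) + x, differentiated automatically at x = 2.0.
x = Dual(2.0, 1.0)   # seed dx/dx = 1
y = x * sin(x) + x
print(y.val, y.dot)  # numerical value and exact derivative at x = 2.0

# Symbolic differentiation would instead rewrite the expression itself
# (producing sin(x) + x*cos(x) + 1) before any numbers are substituted.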