Corpus ID: 26572494

DiffSharp: Automatic Differentiation Library

Atilim Gunes Baydin, Barak A. Pearlmutter, Jeffrey Mark Siskind

In this paper we introduce DiffSharp, an automatic differentiation (AD) library designed with machine learning in mind. AD is a family of techniques that evaluate derivatives at machine precision with only a small constant factor of overhead, by systematically applying the chain rule of calculus at the elementary-operator level. DiffSharp aims to make an extensive array of AD techniques available, in convenient form, to the machine learning community. These include arbitrary nesting of…
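The abstract's core claim (derivatives at machine precision via the chain rule applied at the elementary-operator level) can be illustrated with a minimal forward-mode sketch using dual numbers. This is an illustrative toy in Python, not DiffSharp's actual F# API; all names here (`Dual`, `diff`) are assumptions for the example.

```python
import math

class Dual:
    """A dual number a + b*eps with eps^2 = 0; the tangent b carries the derivative."""
    def __init__(self, primal, tangent=0.0):
        self.primal = primal
        self.tangent = tangent

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.primal + other.primal, self.tangent + other.tangent)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule, applied at the elementary-operator level.
        return Dual(self.primal * other.primal,
                    self.primal * other.tangent + self.tangent * other.primal)
    __rmul__ = __mul__

def sin(x):
    # Each primitive propagates its own local derivative (chain rule).
    return Dual(math.sin(x.primal), math.cos(x.primal) * x.tangent)

def diff(f, x):
    """Derivative of f at x, exact to machine precision (no finite-difference error)."""
    return f(Dual(x, 1.0)).tangent

# d/dx [x * sin(x)] at x = 2 is sin(2) + 2*cos(2) ≈ 0.0770
print(diff(lambda x: x * sin(x), 2.0))
```

Unlike numerical differencing, no step size is chosen and no truncation error is introduced; the overhead over the original function is the small constant factor the abstract mentions.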


DiffSharp: An AD Library for .NET Languages
DiffSharp is an algorithmic differentiation or automatic differentiation (AD) library for the .NET ecosystem, which is targeted by the C# and F# languages, among others. The library has been designed…
A Hitchhiker’s Guide to Automatic Differentiation
An overview of some of the mathematical principles of Automatic Differentiation is provided, like the matrix-vector product based approach, the idea of lifting functions to the algebra of dual numbers, the method of Taylor series expansion on dual numbers and the application of the push-forward operator.
Mini-symposium on automatic differentiation and its applications in the financial industry
This paper shows how automatic differentiation provides a partial answer to the recent explosion of computation to be performed, and gives short introductions to typical cases that arise when one uses AAD on financial markets.
Principles of Automatic Differentiation
Different descriptions of the Forward Mode of AD, like the matrix-vector product based approach, the idea of lifting functions to the algebra of dual numbers, the method of Taylor series expansion on dual numbers and the application of the push-forward operator are summarised.
Kotlin∇: A shape-safe DSL for differentiable programming
This work presents an algebraically based implementation of automatic differentiation with shape-safe tensor operations, written in pure Kotlin; it is the first shape-safe AD library fully compatible with the Java type system.
Efficient differentiable programming in a functional array-processing language
It is shown that, in combination, gradient computation with forward-mode AD can be as efficient as reverse mode, and that the Jacobian matrices required for numerical algorithms such as Gauss-Newton and Levenberg-Marquardt can be computed efficiently.
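The Jacobian computation mentioned above can be sketched with forward mode: one pass per input variable, seeding a unit tangent each time, yields one column of the Jacobian. This is a hedged toy in Python (the `Dual` and `jacobian` names are illustrative assumptions, not any library's API).

```python
class Dual:
    """Dual number: primal value p and tangent t."""
    def __init__(self, p, t=0.0):
        self.p, self.t = p, t
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.p + o.p, self.t + o.t)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.p * o.p, self.p * o.t + self.t * o.p)
    __rmul__ = __mul__

def jacobian(f, x):
    """n forward passes give the n columns of the m-by-n Jacobian of f at x."""
    n = len(x)
    cols = []
    for j in range(n):
        # Seed the j-th unit tangent vector.
        duals = [Dual(x[i], 1.0 if i == j else 0.0) for i in range(n)]
        cols.append([y.t for y in f(duals)])
    # Transpose the collected columns into rows.
    return [list(r) for r in zip(*cols)]

# f(x, y) = (x*y, x + y); its Jacobian at (3, 2) is [[2, 3], [1, 1]]
f = lambda v: [v[0] * v[1], v[0] + v[1]]
print(jacobian(f, [3.0, 2.0]))  # → [[2.0, 3.0], [1.0, 1.0]]
```

This costs n forward passes for n inputs, which is why forward mode suits functions with few inputs and many outputs, while reverse mode suits the opposite (e.g. scalar losses in machine learning).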
Finally, a Polymorphic Linear Algebra Language
The polymorphic design of Pilatus allows us to use multistage programming and rewrite-based optimisation to recover the performance of specialised code, supporting fixed sized matrices, algebraic optimisations, and fusion.
Compilation and Code Optimization for Data Analytics
The vision of abstraction without regret argues that it is possible to use high-level languages for building performance-critical systems that allow for both productivity and high performance, instead of trading off the former for the latter.


Automatic differentiation in machine learning: a survey
By precisely defining the main differentiation techniques and their interrelationships, this work aims to bring clarity to the usage of the terms “autodiff”, “automatic differentiation”, and “symbolic differentiation” as these are encountered more and more in machine learning settings.
Exploitation of structural sparsity in algorithmic differentiation
This thesis aims to provide a tool that minimizes non-AD experts' effort in applying reverse-mode AD to their large-dimensional problems, by presenting algorithms that allow the application of elimination techniques very close to the Gaussian elimination performed in sparse LU factorization.
Evaluating derivatives - principles and techniques of algorithmic differentiation, Second Edition
This second edition has been updated and expanded to cover recent developments in applications and theory, including an elegant NP completeness argument by Uwe Naumann and a brief introduction to scarcity, a generalization of sparsity.
Reverse-mode AD in a functional framework: Lambda the ultimate backpropagator
We show that reverse-mode AD (Automatic Differentiation)—a generalized gradient-calculation operator—can be incorporated as a first-class function in an augmented lambda calculus, and therefore into…
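The idea of reverse mode as a first-class gradient operator can be sketched in a small, hedged Python toy (not the paper's lambda-calculus formulation; `Var` and `grad` are names assumed for this example): `grad(f)` returns an ordinary function, so gradients compose like any other value.

```python
class Var:
    """A value plus the (parent, local-partial) edges recorded during evaluation."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])
    __rmul__ = __mul__

def grad(f):
    """Return the gradient of f as a first-class function."""
    def gradient(*args):
        vs = [Var(a) for a in args]
        out = f(*vs)
        # One reverse sweep from the output; correct here because the only
        # shared nodes are leaf inputs (a full implementation would sweep
        # in reverse topological order).
        out.grad = 1.0
        stack = [out]
        while stack:
            v = stack.pop()
            for parent, partial in v.parents:
                parent.grad += partial * v.grad
                stack.append(parent)
        return [v.grad for v in vs]
    return gradient

# gradient of f(x, y) = x*y + x at (3, 4) is (y + 1, x) = (5, 3)
print(grad(lambda x, y: x * y + x)(3.0, 4.0))  # → [5.0, 3.0]
```

Because `grad` is just a higher-order function, it can in principle be passed around or applied repeatedly, which is the property the lambda-calculus treatment makes precise.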
AUGEM: Automatically generate high performance Dense Linear Algebra kernels on x86 CPUs
A template-based optimization framework, AUGEM, is presented, which can automatically generate fully optimized assembly code for several dense linear algebra kernels, such as GEMM, GEMV, AXPY and DOT, on varying multi-core CPUs without requiring any manual intervention from developers.
Who Invented the Reverse Mode of Differentiation
The present author is credited with facilitating the rebirth of the key idea, namely the reverse mode, which has been suggested by several people from various fields since the late 1960s, if not earlier.
Automatic Differentiation of Algorithms: From Simulation to Optimization
Automatic Differentiation of Algorithms provides a comprehensive and authoritative survey of all recent developments, new techniques, and tools for AD use.
On the Efficient Computation of Sparsity Patterns for Hessians
Two algorithms to detect the sparsity pattern of Hessians are discussed: an approach for the computation of exact sparsity patterns, and a second one for the overestimation of sparsity patterns.
Nesting forward-mode AD in a functional framework
It is hypothesized that one cannot correctly formulate this operator as a function definition in current pure dialects of Haskell because of the primary technical difficulty in ensuring correctness in the face of nested invocation of that operator.
Leveraging .NET meta-programming components from F#: integrated queries and interoperable heterogeneous execution
This paper explores the use of a modest meta-programming extension to F# to access and leverage the functionality of LINQ and other components, and demonstrates an implementation of language integrated SQL queries using the LINQ/SQLMetal libraries.