Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation, Second Edition

  • Andreas Griewank, Andrea Walther
  • Frontiers in Applied Mathematics
Algorithmic, or automatic, differentiation (AD) is a growing area of theoretical research and software development concerned with the accurate and efficient evaluation of derivatives for function evaluations given as computer programs. The resulting derivative values are useful for all scientific computations that are based on linear, quadratic, or higher order approximations to nonlinear scalar or vector functions. AD has been applied in particular to optimization, parameter identification… 
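The core idea the abstract describes — evaluating exact derivatives alongside the function itself — can be sketched with a minimal forward-mode implementation using dual numbers. Everything below (the `Dual` class and the sample function `f`) is an illustrative sketch, not code from the book.

```python
class Dual:
    """Minimal dual number for forward-mode AD: carries a value and a derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule propagates the derivative exactly, with no truncation error.
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return x * x + 3 * x + 1   # f'(x) = 2x + 3

x = Dual(2.0, 1.0)  # seed derivative 1.0 to differentiate with respect to x
y = f(x)
print(y.val, y.dot)  # 11.0 7.0
```

Because the function is given as a computer program, the same overloading machinery differentiates arbitrary control flow, which is what distinguishes AD from symbolic differentiation of a closed-form expression.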

An introduction to algorithmic differentiation

This work provides an introduction to AD and presents its basic ideas and techniques, some of its most important results, the implementation paradigms it relies on, the connection it has to other domains including machine learning and parallel computing, and a few of the major open problems in the area.

Special section: Automatic differentiation and its applications

Algorithmic Differentiation of Numerical Methods: Second-Order Tangent and Adjoint Solvers for Systems of Parametrized Nonlinear Equations

Forward and reverse modes of algorithmic differentiation (AD) transform implementations of multivariate vector functions F : IR^n → IR^m as computer programs into tangent and adjoint code, respectively, which are of particular interest in large-scale gradient-based nonlinear optimization.
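The tangent/adjoint distinction this abstract draws can be made concrete for a toy F : R^2 → R^2. The function and its hand-derived tangent and adjoint codes below are illustrative stand-ins, not taken from the cited paper.

```python
import math

# Toy F : R^2 -> R^2 (a stand-in for the paper's F : R^n -> R^m):
# F(x1, x2) = (x1*x2, sin(x1)), so J = [[x2, x1], [cos(x1), 0]].

def F_tangent(x1, x2, d1, d2):
    # Tangent (forward) code: returns F(x) and the Jacobian-vector product J @ d.
    y1, dy1 = x1 * x2, d1 * x2 + x1 * d2
    y2, dy2 = math.sin(x1), math.cos(x1) * d1
    return (y1, y2), (dy1, dy2)

def F_adjoint(x1, x2, w1, w2):
    # Adjoint (reverse) code: returns the vector-Jacobian product w @ J.
    xb1 = w1 * x2 + w2 * math.cos(x1)
    xb2 = w1 * x1
    return xb1, xb2

x1, x2 = 1.0, 2.0
_, (dy1, dy2) = F_tangent(x1, x2, 1.0, 0.0)   # first column of J
xb1, xb2 = F_adjoint(x1, x2, 1.0, 0.0)        # first row of J
```

For a scalar objective (m = 1), one adjoint sweep delivers the entire gradient, which is why the adjoint mode dominates in large-scale gradient-based optimization regardless of the number of inputs n.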

A review of automatic differentiation and its efficient implementation

  • C. Margossian
  • WIREs Data Mining Knowl. Discov.
  • 2019
Automatic differentiation is a powerful tool to automate the calculation of derivatives and is preferable to more traditional methods, especially when differentiating complex algorithms and mathematical functions.
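The advantage over "more traditional methods" (divided differences in particular) can be seen in a short comparison; the function f below is an arbitrary illustrative choice, differentiated here by hand-coded chain-rule propagation.

```python
import math

# Forward derivative propagation (the mechanical chain rule AD automates)
# versus a central finite-difference approximation, for f(x) = exp(sin(x)).
def f_and_grad(x):
    s, ds = math.sin(x), math.cos(x)        # chain rule, inner step
    y, dy = math.exp(s), math.exp(s) * ds   # chain rule, outer step
    return y, dy

x = 1.3
_, exact = f_and_grad(x)

h = 1e-6
fd = (math.exp(math.sin(x + h)) - math.exp(math.sin(x - h))) / (2 * h)
# AD-style propagation is exact to machine precision; the finite-difference
# value carries truncation and cancellation error that depends on h.
print(abs(fd - exact))
```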

Algorithmic Differentiation of Numerical Methods: Tangent-Linear and Adjoint Solvers for Systems of Nonlinear Equations

Following the discussion of the proposed terminology, the algorithmic formalism is developed, building on prior work by other colleagues, and an implementation based on the AD software dco/c++ is presented.

Algorithmic Differentiation of Numerical Methods: Second-Order Adjoint Solvers for Parameterized Systems of Nonlinear Equations

The adjoint mode is of particular interest in large-scale gradient-based nonlinear optimization because its computational cost is independent of the number of free variables; the nonlinear solver itself can be eliminated by taking a symbolic approach to the differentiation of the nonlinear system.
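A minimal scalar sketch of that symbolic approach: for a residual F(x, p) = 0 solved for x, the implicit function theorem gives dx/dp = -(dF/dx)^{-1} (dF/dp), so the solver's internal iterations never need to be differentiated. The residual, Newton solver, and parameter value below are illustrative, not from the cited paper.

```python
def F(x, p):
    return x * x - p          # residual; F(x, p) = 0 means x = sqrt(p)

def dFdx(x, p):
    return 2.0 * x

def dFdp(x, p):
    return -1.0

def solve(p, x=1.0, tol=1e-12):
    # Newton's method for F(x, p) = 0; these iterations are NOT differentiated.
    while abs(F(x, p)) > tol:
        x -= F(x, p) / dFdx(x, p)
    return x

p = 4.0
x = solve(p)                      # x = 2.0
dxdp = -dFdp(x, p) / dFdx(x, p)   # implicit function theorem: 1/(2x) = 0.25
```

Differentiating the solver iterations mechanically would tape every Newton step; the symbolic elimination replaces all of that with one linear solve at the converged solution.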

Efficient (Partial) Determination of Derivative Matrices via Automatic Differentiation

Here it is shown how the popular graph-coloring approach to AD can be adapted to account for cases where some elements may be constant for all iterates, with resulting gains in efficiency.
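The graph-coloring idea referred to here is that structurally orthogonal Jacobian columns (columns whose nonzero patterns do not overlap) can share a single tangent evaluation. A small sketch for a tridiagonal pattern follows; the function F, the 3-coloring, and the finite-difference stand-in for a tangent sweep are all illustrative assumptions.

```python
def F(x):
    # Tridiagonal-like structure: component i depends only on x[i-1], x[i], x[i+1].
    n = len(x)
    return [x[max(i - 1, 0)] + 2 * x[i] + x[min(i + 1, n - 1)] for i in range(n)]

def jvp(x, v, h=1e-7):
    # One directional derivative of F (a tangent sweep, approximated here
    # by a finite difference for brevity; an AD tool would compute it exactly).
    y0 = F(x)
    y1 = F([xi + h * vi for xi, vi in zip(x, v)])
    return [(a - b) / h for a, b in zip(y1, y0)]

n = 6
colors = [i % 3 for i in range(n)]   # valid 3-coloring for this tridiagonal pattern
x = [1.0] * n

# 3 compressed evaluations recover all 6 columns: each seed vector sums
# the columns of one color, and J[i][j] = compressed[colors[j]][i]
# wherever J[i][j] is structurally nonzero.
compressed = [jvp(x, [1.0 if colors[j] == c else 0.0 for j in range(n)])
              for c in range(3)]
```

The paper's refinement is to also exploit entries known to be constant across iterates, shrinking the coloring further and so the number of tangent sweeps.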

Introduction to Automatic Differentiation and MATLAB

A survey of more advanced topics in automatic differentiation includes an introduction to the reverse mode (the authors' implementation is forward mode) and considerations in arbitrary-order multivariable series computation.
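The reverse mode the survey introduces can be sketched with a tiny operator-overloading recursion; this is illustrative Python, not the authors' forward-mode MATLAB implementation.

```python
class Var:
    """Node in the computational graph; grad accumulates the adjoint."""
    def __init__(self, val, parents=()):
        self.val, self.parents, self.grad = val, parents, 0.0

    def __add__(self, other):
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.val * other.val,
                   [(self, other.val), (other, self.val)])

    def backward(self, seed=1.0):
        # Push the incoming adjoint back through each local partial derivative.
        # (Naive recursion; a real tool would traverse a tape once.)
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, y = Var(3.0), Var(4.0)
z = x * y + x          # z = xy + x, so dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

One backward sweep from the output fills in the partial derivatives with respect to every input simultaneously, the property that motivates the reverse mode in the first place.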