Introduction to Automatic Differentiation and MATLAB Object-Oriented Programming

@article{Neidinger,
  title={Introduction to Automatic Differentiation and MATLAB Object-Oriented Programming},
  author={Richard D. Neidinger},
  journal={SIAM Review}
}
An introduction to both automatic differentiation and object-oriented programming can enrich a numerical analysis course that typically incorporates numerical differentiation and basic MATLAB computation. Automatic differentiation consists of exact algorithms on floating-point arguments. This implementation overloads standard elementary operators and functions in MATLAB with a derivative rule in addition to the function value; for example, $\sin u$ will also compute $(\cos u)\ast u^{\prime}$ …
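The operator-overloading idea in the abstract can be sketched compactly with dual numbers. The class and function names below are illustrative, not the paper's MATLAB implementation; Python is used here for brevity, but the same pattern carries over to a MATLAB classdef with overloaded methods.

```python
import math

class Dual:
    """Dual number (val, dot): a value paired with its derivative.
    Each overloaded operator applies the matching derivative rule,
    mirroring the operator-overloading approach the abstract describes."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(u):
    # the rule quoted in the abstract: sin(u) also carries (cos u) * u'
    return Dual(math.sin(u.val), math.cos(u.val) * u.dot)

# derivative of f(x) = x * sin(x) at x = 2; exact f'(x) = sin(x) + x cos(x)
x = Dual(2.0, 1.0)   # seed dx/dx = 1
f = x * sin(x)
```

After evaluation, `f.val` holds $2\sin 2$ and `f.dot` holds $\sin 2 + 2\cos 2$, with no truncation error from a difference quotient.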


An efficient overloaded method for computing derivatives of mathematical functions in MATLAB

An object-oriented method is presented that computes, without truncation error, the derivatives of functions defined by MATLAB computer codes. The derivatives are generated simply by evaluating the function on an instance of the class, making the method straightforward to use while enabling differentiation of highly complex functions.

Automatic Differentiation in Julia with Applications to Numerical Solution of PDEs

Investigates whether the new programming language Julia can be used both for quickly prototyping new oil-recovery simulators and for implementing highly efficient industrial simulators; the promising results suggest that Julia, combined with an AD tool such as the local AD presented here, is well suited to building simulator prototypes.


The benefits of automatic differentiation, compared with other approaches for computing the Jacobian elements in power flow equations, are explained.

Efficient Calculation of Derivatives using Automatic Differentiation

This thesis explores a method of computation that calculates the derivatives of mathematical functions to floating-point precision, without introducing extra complexity to the code that uses it, and shows how this can help solve computational problems in optimization and reservoir simulation using the Python programming language.

Automatic differentiation of hybrid models Illustrated by Diffedge Graphic Methodology. (Survey)

The automatic differentiation of hybrid models, viz. models that may contain delays, logical tests, discontinuities, or loops, is investigated; the result of automatic differentiation is a new block diagram that can easily be translated to produce real-time embedded programs.

Combinatory Adjoints and Differentiation

We develop a compositional approach to automatic and symbolic differentiation based on categorical constructions in functional analysis, where derivatives are linear functions on abstract vectors.

Dual Numbers and Automatic Differentiation to Efficiently Compute Velocities and Accelerations

The forward mode of AD using dual numbers is implemented to develop efficient methods for computing velocities, accelerations, gradients, and Hessians.
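Velocities and accelerations fall out of a second-order extension of dual numbers: each quantity carries a value, a first derivative, and a second derivative, and multiplication propagates all three via the Leibniz rule. The sketch below is illustrative (the trajectory `p(t) = t^3` and the class name are made up for the example), not the paper's implementation.

```python
class Dual2:
    """Truncated second-order number (f, df, ddf): propagates a value,
    its first derivative (velocity), and its second derivative
    (acceleration) through arithmetic."""
    def __init__(self, f, df=0.0, ddf=0.0):
        self.f, self.df, self.ddf = f, df, ddf

    def __add__(self, o):
        o = o if isinstance(o, Dual2) else Dual2(o)
        return Dual2(self.f + o.f, self.df + o.df, self.ddf + o.ddf)

    def __mul__(self, o):
        o = o if isinstance(o, Dual2) else Dual2(o)
        # Leibniz rule: (fg)'' = f''g + 2 f'g' + f g''
        return Dual2(self.f * o.f,
                     self.df * o.f + self.f * o.df,
                     self.ddf * o.f + 2 * self.df * o.df + self.f * o.ddf)

# hypothetical trajectory p(t) = t^3, evaluated at t = 2
t = Dual2(2.0, 1.0, 0.0)   # seed dt/dt = 1, d2t/dt2 = 0
p = t * t * t
# p.f = 8.0 (position), p.df = 12.0 (velocity), p.ddf = 12.0 (acceleration)
```

One evaluation of the trajectory thus yields position, velocity ($3t^2$), and acceleration ($6t$) simultaneously, with no finite-difference stencils.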

Compressible Flow and Rapid Prototyping

  • Computer Science
    Chapter in An Introduction to Reservoir Simulation Using MATLAB/GNU Octave
  • 2019

This chapter develops a compact and transparent solver for compressible flow and then extends the basic single-phase model to include pressure-dependent viscosity, non-Newtonian fluid behavior, and temperature effects.



Evaluating derivatives - principles and techniques of algorithmic differentiation, Second Edition

This second edition has been updated and expanded to cover recent developments in applications and theory, including an elegant NP-completeness argument by Uwe Naumann and a brief introduction to scarcity, a generalization of sparsity.

Automatic differentiation : applications, theory, and implementations

Automatic Differentiation: A Tool for Variational Data Assimilation and Adjoint Sensitivity Analysis for Flood Modeling.

An efficient overloaded implementation of forward mode automatic differentiation in MATLAB

The Mad package described here facilitates the evaluation of first derivatives of multidimensional functions defined by computer code written in MATLAB, by separating the linear combination of derivative vectors into a dedicated derivative-vector class, derivvec.

An efficient method for the numerical evaluation of partial derivatives of arbitrary order

The key ideas are a hyperpyramid data structure and a generalized Leibniz's rule which produces any partial derivative by forming the minimum number of products (between two lower partials) together with a product of binomial coefficients.
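The univariate core of the generalized Leibniz rule can be sketched as a binomial-weighted convolution of derivative arrays; the paper's contribution extends this to arbitrary partial derivatives via a hyperpyramid data structure, which the illustrative Python function below (name and example values are hypothetical) does not attempt to reproduce.

```python
from math import comb

def product_derivs(f, g):
    """Given arrays f[k] = f^(k)(x) and g[k] = g^(k)(x) of derivatives at
    a point, return the derivatives of the product f*g via Leibniz's rule:
        (fg)^(n) = sum_{k=0..n} C(n, k) f^(k) g^(n-k)
    """
    n = len(f)
    return [sum(comb(m, k) * f[k] * g[m - k] for k in range(m + 1))
            for m in range(n)]

# product of f(x) = x^2 and g(x) = x at x = 1 is x^3,
# whose derivatives at x = 1 are [1, 3, 6, 6]
derivs = product_derivs([1.0, 2.0, 2.0, 0.0], [1.0, 1.0, 0.0, 0.0])
```

Each product derivative is formed from products of two lower-order derivatives weighted by binomial coefficients, which is exactly the structure the paper exploits to minimize the number of multiplications.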

Computing multivariable Taylor series to arbitrary order

This APL*PLUS III implementation loops through one nested reference array and takes sub-arrays from another, yielding a practical solution to a problem that can make tremendous demands on time and space.

Jacobian code generated by source transformation and vertex elimination can be as efficient as hand-coding

It is shown that the Jacobian code produced by EliAD is as efficient as hand-coded Jacobian code, and between 2 and 20 times more efficient than that produced by current state-of-the-art automatic differentiation tools, even when those tools use sophisticated techniques such as sparse Jacobian compression.

Automatic Differentiation of Algorithms: From Simulation to Optimization

Automatic Differentiation of Algorithms provides a comprehensive and authoritative survey of all recent developments, new techniques, and tools for AD use.

ATOMFT: solving ODEs and DAEs using Taylor series
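The Taylor-series approach that tools such as ATOMFT automate can be illustrated on a single scalar ODE: a recurrence generates the solution's Taylor coefficients, and a step evaluates the truncated series. The example problem ($y' = y^2$, $y(0)=1$, exact solution $1/(1-t)$) and the function names are chosen here for illustration, not taken from ATOMFT itself.

```python
def taylor_coeffs_riccati(n):
    """Taylor coefficients a_k of the solution of y' = y^2, y(0) = 1.
    The Cauchy product gives the series coefficients of y^2, and
    integrating term by term yields the recurrence
        (k+1) a_{k+1} = sum_{j=0..k} a_j a_{k-j}.
    For this problem every a_k equals 1 (the geometric series 1/(1-t))."""
    a = [1.0]                                            # a_0 = y(0)
    for k in range(n - 1):
        cauchy = sum(a[j] * a[k - j] for j in range(k + 1))  # coeff of y^2
        a.append(cauchy / (k + 1))
    return a

def step(a, h):
    """Evaluate the truncated Taylor series at t = h (Horner form)."""
    s = 0.0
    for c in reversed(a):
        s = s * h + c
    return s

coeffs = taylor_coeffs_riccati(20)
y_h = step(coeffs, 0.1)   # close to the exact value 1/(1 - 0.1)
```

Because the coefficients come from an exact recurrence rather than repeated numerical differentiation, the order of the method is limited only by how many terms one generates, which is what makes the Taylor approach attractive for ODEs and DAEs.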

Advances in Automatic Differentiation

This collection covers advances in automatic differentiation theory and practice and discusses various applications, which provide insight into effective strategies for using automatic differentiation for inverse problems and design optimization.

Robust Aircraft Conceptual Design Using Automatic Differentiation in Matlab

Gradient-based constrained optimisation of the stochastic model is shown to be more efficient using AD-obtained gradients than finite differencing, and a post-optimality analysis confirms the attractiveness of the Sigma-Point technique for uncertainty propagation.