Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation

@inproceedings{Giles2008CollectedMD,
  title={Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation},
  author={Michael B. Giles},
  year={2008}
}
This paper collects together a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation. It highlights in particular the remarkable contribution of a 1948 paper by Dwyer and Macphail which derives the linear and adjoint sensitivities of a matrix product, inverse and determinant, and a number of related results motivated by applications in multivariate analysis in statistics. 
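For orientation, the first-order results referred to here can be stated compactly in Giles' notation, with dots for forward (tangent) sensitivities and bars for reverse (adjoint) sensitivities; the following is a brief sketch of the product, inverse and determinant cases discussed in the paper:

C = A B:        \dot{C} = \dot{A} B + A \dot{B},            \bar{A} = \bar{C} B^T,    \bar{B} = A^T \bar{C}
C = A^{-1}:     \dot{C} = -C \dot{A} C,                     \bar{A} = -C^T \bar{C} C^T
d = \det(A):    \dot{d} = d \, \mathrm{tr}(A^{-1} \dot{A}),  \bar{A} = d \, \bar{d} \, A^{-T}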
Citations

On evaluating higher-order derivatives of the QR decomposition of tall matrices with full column rank in forward and reverse mode algorithmic differentiation
We address the task of higher-order derivative evaluation of computer programs that contain QR decompositions of tall matrices with full column rank. The approach is a combination of univariate …
Adjoints and Automatic (Algorithmic) Differentiation in Computational Finance
Two of the most important areas in computational finance, Greeks and calibration, are based on efficient and accurate computation of a large number of sensitivities. This paper gives …
Efficient Automatic Differentiation of Matrix Functions
Forward and reverse mode automatic differentiation methods for functions that take a vector argument make derivative computation efficient. However, the determinant and inverse of a matrix are not …
Algorithmic Differentiation of Linear Algebra Functions with Application in Optimum Experimental Design (Extended Version)
We derive algorithms for higher order derivative computation of the rectangular QR and eigenvalue decomposition of symmetric matrices with distinct eigenvalues in the forward and reverse mode of …
A Note on Adjoint Linear Algebra
A new proof for adjoint systems of linear equations is presented, built on the principles of Algorithmic Differentiation, that yields adjoint inner vector, matrix-vector, and matrix-matrix products, leading to an alternative proof for first- as well as higher-order adjoint linear systems.
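As a concrete instance of the adjoint linear-system results this note is concerned with (a sketch in the same dot/bar notation as above, not the note's own derivation): for a solve A x = b with adjoint \bar{x} of the solution given, the input adjoints are

\bar{b} = A^{-T} \bar{x}, \qquad \bar{A} = -\bar{b} \, x^T,

so one additional linear solve with A^T yields the sensitivities with respect to both b and A.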
On the Efficient Evaluation of Higher-Order Derivatives of Real-Valued Functions Composed of Matrix Operations
Two different hierarchical levels of algorithmic differentiation are compared: the traditional approach and a higher-level approach where matrix operations are considered to be atomic. …
Algorithmic Differentiation of Numerical Methods : Second-Order Tangent and Adjoint Solvers for Systems of Parametrized Nonlinear Equations
Forward and reverse modes of algorithmic differentiation (AD) transform implementations of multivariate vector functions F : IR^n → IR^m as computer programs into tangent and adjoint code, respectively. …
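For context, the standard first-order tangent and adjoint relations for a parametrized nonlinear system F(x, p) = 0 with nonsingular Jacobian ∂F/∂x are sketched below (the paper itself extends such relations to second order):

Tangent:  (∂F/∂x) \dot{x} = -(∂F/∂p) \dot{p}
Adjoint:  solve (∂F/∂x)^T λ = \bar{x}, then \bar{p} = -(∂F/∂p)^T λ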
Computing Higher Order Derivatives of Matrix and Tensor Expressions
This work presents an algorithmic framework for computing matrix and tensor derivatives that extends seamlessly to higher order derivatives, and shows a speedup between one and four orders of magnitude over state-of-the-art frameworks when evaluating higher order derivatives.
Efficient Higher Order Derivatives of Objective Functions Composed of Matrix Operations
A method is presented that combines two well-known techniques from Algorithmic Differentiation: univariate Taylor propagation on scalars (UTPS) and first-order forward and reverse mode on matrices (UTPM); the combination inherits many desirable properties.
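The UTPS building block mentioned here is simple enough to sketch. The following minimal Python fragment (class and variable names are illustrative, not code from the cited paper) propagates truncated univariate Taylor coefficients through addition and multiplication, which is the essence of the technique:

import numpy as np

class UTPS:
    """Truncated Taylor series c[0] + c[1]*t + ... + c[D-1]*t^(D-1)."""
    def __init__(self, coeffs):
        self.c = np.asarray(coeffs, dtype=float)

    def __add__(self, other):
        return UTPS(self.c + other.c)

    def __mul__(self, other):
        D = len(self.c)
        out = np.zeros(D)
        for d in range(D):  # truncated Cauchy product of the coefficient sequences
            out[d] = sum(self.c[k] * other.c[d - k] for k in range(d + 1))
        return UTPS(out)

# Example: derivatives of f(x) = x*x + x at x = 3, seeding x as 3 + 1*t.
# Coefficient d of the result equals f^(d)(3) / d!.
x = UTPS([3.0, 1.0, 0.0])
y = x * x + x
print(y.c)   # [12., 7., 1.]  ->  f(3) = 12, f'(3) = 7, f''(3)/2! = 1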
Algorithmic differentiation in Python with AlgoPy
AlgoPy provides the means to compute derivatives of arbitrary order and Taylor approximations of programs written in Python with NumPy, based on a combination of univariate Taylor polynomial arithmetic and matrix calculus in the (combined) forward/reverse mode of Algorithmic Differentiation (AD).
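A minimal usage sketch, assuming AlgoPy's UTPM.init_jacobian and UTPM.extract_jacobian helpers (check the AlgoPy documentation for the exact API of the version in use):

import numpy as np
import algopy

def f(x):
    # simple scalar-valued function of a 3-vector
    return x[0] * x[1] * x[2] + 7.0 * x[1]

# Forward-mode Jacobian via univariate Taylor polynomial arithmetic on matrices (UTPM).
x = algopy.UTPM.init_jacobian(np.array([3.0, 5.0, 7.0]))
y = f(x)
J = algopy.UTPM.extract_jacobian(y)
print(J)   # expected: [35. 28. 15.]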

References

(Showing 1–10 of 22 references.)
An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation
This paper collects together a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation (AD). It highlights in particular the remarkable …
Matrix inversion algorithms by means of automatic differentiation
There are many matrix inversion algorithms, some being widely known and others not as widely known. We will show that some of the known elaborate formulas for matrix inversion can be derived by …
Some Applications of Matrix Derivatives in Multivariate Analysis
It is claimed that the reasons for using matrices of derivatives, in appropriate situations, are as compelling as those for using matrices. This paper provides basic material for such use.
Evaluating derivatives - principles and techniques of algorithmic differentiation, Second Edition
This second edition has been updated and expanded to cover recent developments in applications and theory, including an elegant NP-completeness argument by Uwe Naumann and a brief introduction to scarcity, a generalization of sparsity.
Old and New Matrix Algebra Useful for Statistics
The partials with respect to the numerator are laid out according to the shape of Y while the partials with respect to the denominator are laid out according to the transpose of X. For example, dy/dx …
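On one reading of the convention described above (an interpretation of the truncated snippet, not the book's own text): for y \in \mathbb{R}^m and x \in \mathbb{R}^n,

\frac{\partial y}{\partial x} \in \mathbb{R}^{m \times n}, \quad \Big(\frac{\partial y}{\partial x}\Big)_{ij} = \frac{\partial y_i}{\partial x_j}, \qquad \text{and for scalar } y,\ X \in \mathbb{R}^{m \times n}: \quad \frac{\partial y}{\partial X} \in \mathbb{R}^{n \times m}, \quad \Big(\frac{\partial y}{\partial X}\Big)_{ij} = \frac{\partial y}{\partial X_{ji}},

i.e. rows follow the shape of the numerator and columns follow the transpose of the denominator.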
ADMAT: Automatic differentiation in MATLAB using object oriented methods
Differentiation is one of the fundamental problems in numerical mathematics. The solution of many optimization problems and other applications require knowledge of the gradient, the Jacobian matrix, …
An efficient overloaded implementation of forward mode automatic differentiation in MATLAB
S. Forth, TOMS, 2006
The Mad package described here facilitates the evaluation of first derivatives of multidimensional functions that are defined by computer codes written in MATLAB, through the separation of the linear combination of derivative vectors into a separate derivative vector class, derivvec.
ADMIT-1: automatic differentiation and MATLAB interface toolbox
This article provides an introduction to the design and usage of ADMIT-1, a generic automatic differentiation tool that enables the computation of sparse Jacobian and Hessian matrices from a MATLAB environment.
Using Complex Variables to Estimate Derivatives of Real Functions
A method to approximate derivatives of real functions using complex variables, which avoids the subtractive cancellation errors inherent in the classical derivative approximations, is described.
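The complex-step trick summarized here is short enough to show directly (a sketch in Python, not code from the cited paper):

import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    """Approximate f'(x) as Im(f(x + i*h)) / h.

    No subtraction of nearly equal quantities occurs, so h can be taken
    extremely small without cancellation error; f must be analytic and
    implemented so that it accepts complex arguments.
    """
    return np.imag(f(x + 1j * h)) / h

# Example: d/dx [exp(x) * sin(x)] at x = 1.0
f = lambda x: np.exp(x) * np.sin(x)
approx = complex_step_derivative(f, 1.0)
exact = np.exp(1.0) * (np.sin(1.0) + np.cos(1.0))
print(approx, exact)   # the two values agree to machine precision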
Jacobians of matrix transformations and functions of matrix argument
Covers Jacobians of matrix transformations; Jacobians in orthogonal and related transformations; Jacobians in the complex case; transformations involving eigenvalues and unitary matrices; and some special functions …