Matrix inversion algorithms by means of automatic differentiation

@article{Kubota1994MatrixIA,
  title={Matrix inversion algorithms by means of automatic differentiation},
  author={Koichi Kubota},
  journal={Applied Mathematics Letters},
  year={1994},
  volume={7},
  pages={19-22}
}
  • Koichi Kubota
  • Published 1 July 1994
  • Mathematics
  • Applied Mathematics Letters
Abstract

There are many matrix inversion algorithms, some widely known and others less so. We will show that some of the known elaborate formulas for matrix inversion can be derived by differentiating the logarithm of the determinant of a matrix by means of the top-down algorithm of automatic differentiation.
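The identity behind this derivation is that the gradient of log det(A) with respect to the entries of A is the transposed inverse, i.e. d(log det A)/dA_ij = (A^{-1})_ji, so a single reverse-mode ("top-down") sweep through a log-determinant evaluation yields the whole inverse. A minimal sketch in JAX (purely illustrative, not the paper's algorithm; the matrix A below is an arbitrary nonsingular example) that checks the identity numerically:

```python
# Sketch: reverse-mode AD of log(det(A)) recovers the transpose of A^{-1}.
# Illustrates the identity d(log det A)/dA = A^{-T}; not Kubota's derivation itself.
import jax
import jax.numpy as jnp

def log_det(A):
    # slogdet is a numerically stable way to obtain log|det A|
    sign, logabsdet = jnp.linalg.slogdet(A)
    return logabsdet

A = jnp.array([[4.0, 1.0, 0.0],
               [2.0, 3.0, 1.0],
               [0.0, 1.0, 2.0]])

grad_A = jax.grad(log_det)(A)   # one reverse-mode sweep: gradient w.r.t. every entry of A
inv_T = jnp.linalg.inv(A).T     # explicit inverse, transposed, for comparison

print(jnp.allclose(grad_A, inv_T, atol=1e-5))  # expected: True
```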
Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation
This paper collects together a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation. It highlights in particular the remarkable …
Differentiation of generalized inverses for rational and polynomial matrices
This paper combines Layton's method with the representation of the Moore-Penrose inverse of a one-variable polynomial matrix from [24] and develops an algorithm for computing the gradient of the Moore-Penrose inverse of a one-variable polynomial matrix.
An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation
This paper collects together a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation (AD). It highlights in particular the remarkable …
Differentiation of matrix functionals using triangular factorization
It is shown how the methods apply to several applications where the functional is a log determinant, including spline smoothing, covariance selection and restricted maximum likelihood.
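For context, a log-determinant functional of this kind is commonly evaluated through a triangular (Cholesky) factor, since log det A = 2 * sum_i log L_ii when A = L L^T, and that factorization can itself be differentiated. A small sketch along those lines (an illustrative assumption-laden example for a symmetric positive definite A, not code from the cited paper):

```python
# Sketch (assumes A is symmetric positive definite): evaluate log det A
# through its Cholesky factor and differentiate the result with reverse-mode AD.
import jax
import jax.numpy as jnp

def log_det_via_cholesky(A):
    L = jnp.linalg.cholesky(A)                  # A = L @ L.T with L lower triangular
    return 2.0 * jnp.sum(jnp.log(jnp.diag(L)))  # log det A = 2 * sum(log L_ii)

A = jnp.array([[4.0, 1.0, 0.0],
               [1.0, 3.0, 1.0],
               [0.0, 1.0, 2.0]])

val, grad_A = jax.value_and_grad(log_det_via_cholesky)(A)
print(val)                                                    # log det A
print(jnp.allclose(grad_A, jnp.linalg.inv(A), atol=1e-5))     # for SPD A the gradient is A^{-1}
```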
An adjoint for likelihood maximization
The process of likelihood maximization can be found in many different areas of computational modelling. However, the construction of such models via likelihood maximization requires the solution of a …
Kriging and the Importance of Efficient Hyperparameter Tuning
The process of likelihood maximization can be found in many different areas of computational modelling. However, the construction of such models via likelihood maximization requires the solution of a …
User guide for MAD - a Matlab automatic differentiation toolbox
This user guide covers installation of MAD on UNIX and PC platforms, use of high-level interfaces for solving ODEs and optimization problems outside of the TOMLAB framework, and basic and advanced use of the forward mode of automatic differentiation.
User Guide for MAD - A Matlab Automatic Differentiation Package, TOMLAB/MAD, Version 1.4, The Forward Mode
This user guide covers installation of MAD on UNIX and PC platforms, use of high-level interfaces for solving ODEs and optimization problems outside of the TOMLAB framework, and basic and advanced use of the forward mode of automatic differentiation.
Proper orthogonal decomposition & kriging strategies for design
This thesis aims to address the total likelihood optimisation cost through the application of an adjoint of the likelihood function within a hybridised optimisation algorithm and the development of a novel optimisation strategy employing a reparameterisation of the original design problem through proper orthogonal decomposition.

References

Automatic Differentiation: Techniques and Applications
  • L. B. Rall
  • Computer Science
  • Lecture Notes in Computer Science
  • 1981
This paper presents a procedure for automatic computation of gradients, Jacobians, Hessians, and applications to optimization in the form of a discrete-time model.
The Complexity of Partial Derivatives
Using the nonscalar complexity in k, the complexity of single power sums, single elementary symmetric functions, the resultant and the discriminant as root functions are determined up to order of magnitude.
Automatic computation of partial derivatives and rounding error estimates with applications to large-scale systems of nonlinear equations
Recently, a new approach has been proposed to efficiently compute the accurate values of partial derivatives of a function or functions, and simultaneously to estimate the rounding errors in …
Simultaneous computation of functions, partial derivatives and estimates of rounding errors —Complexity and practicality—
A practical approach is proposed to the problem of simultaneously computing a function, its partial derivatives with respect to all the variables, and an estimate of the rounding error incurred in …
A taxonomy of automatic differentiation tools
A taxonomy of AD tools is presented that places AD tools into the Elemental, Extensional, Integral, Operational and Symbolic classes, and examines each tool individually with respect to the mode of differentiation used and the degree of derivatives computed.