Corpus ID: 15833207

Algorithmic Differentiation of Linear Algebra Functions with Application in Optimum Experimental Design (Extended Version)

@article{Walter2010AlgorithmicDO,
  title={Algorithmic Differentiation of Linear Algebra Functions with Application in Optimum Experimental Design (Extended Version)},
  author={Sebastian F. Walter and Lutz Lehmann},
  journal={ArXiv},
  year={2010},
  volume={abs/1001.1654}
}
We derive algorithms for higher-order derivative computation of the rectangular QR decomposition and the eigenvalue decomposition of symmetric matrices with distinct eigenvalues, in the forward and reverse mode of algorithmic differentiation (AD), using univariate Taylor propagation of matrices (UTPM). Linear algebra functions are regarded as elementary functions rather than as algorithms. The presented algorithms are implemented in the BSD-licensed AD tool ALGOPY. Numerical tests show that the UTPM algorithms derived…
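The core UTPM idea — propagating truncated univariate Taylor polynomials whose coefficients are matrices — can be sketched for the elementary operation C = A·B. This is an illustrative reconstruction under our own naming, not the ALGOPY implementation:

```python
import numpy as np

def utpm_matmul(A_coeffs, B_coeffs):
    """Propagate truncated matrix Taylor polynomials through C = A @ B.

    A_coeffs, B_coeffs: lists [X_0, X_1, ..., X_D] of matrix Taylor
    coefficients, i.e. X(t) = sum_d X_d t^d + O(t^{D+1}).
    Returns the coefficients of C(t) = A(t) @ B(t) via the Cauchy product
    C_d = sum_{k=0}^{d} A_k @ B_{d-k}.
    """
    D = len(A_coeffs)
    return [sum(A_coeffs[k] @ B_coeffs[d - k] for k in range(d + 1))
            for d in range(D)]

# First-order sanity check: d/dt (A(t) @ B(t)) at t = 0 is A0 @ B1 + A1 @ B0.
rng = np.random.default_rng(0)
A0, A1, B0, B1 = (rng.standard_normal((3, 3)) for _ in range(4))
C = utpm_matmul([A0, A1], [B0, B1])
assert np.allclose(C[0], A0 @ B0)
assert np.allclose(C[1], A0 @ B1 + A1 @ B0)
```

Higher-degree coefficients carry higher-order directional derivative information, which is what lets the QR and eigenvalue decompositions be treated as elementary functions in this scheme.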
QR and LQ Decomposition Matrix Backpropagation Algorithms for Square, Wide, and Deep Matrices and Their Software Implementation
This article presents matrix backpropagation algorithms for the QR decomposition of matrices $A_{m, n}$ that are either square (m = n), wide (m < n), or deep (m > n), with rank $k = \min(m, n)$. Furthermore, we derive…
A Differentiable Contact Model to Extend Lagrangian and Hamiltonian Neural Networks for Modeling Hybrid Dynamics
The proposed contact model extends the scope of Lagrangian and Hamiltonian neural networks by allowing simultaneous learning of contact properties and system properties, and can also accommodate inequality constraints, such as limits on the joint angles.
Neural Graph Matching Network: Learning Lawler's Quadratic Assignment Problem with Extension to Hypergraph and Multiple-graph Matching
This paper presents a QAP network that learns directly with the affinity matrix (equivalently, the association graph), whereby the matching problem is translated into a vertex classification task; it is the first network to learn directly with the general Lawler's QAP.
Using flow models with sensitivities to study cost efficient monitoring programs of co2 storage sites
A key part of planning CO2 storage sites is to devise a monitoring strategy. The aim of this strategy is to fulfill the requirements of legislation and lower the cost of the operation by avoiding…
Unsupervised Deep Learning by Injecting Low-Rank and Sparse Priors
  • Tomoya Sakai
  • Computer Science, Engineering
  • ArXiv
  • 2021
This work focuses on employing sparsity-inducing priors in deep learning to encourage the network to concisely capture the nature of high-dimensional data in an unsupervised way, and plugs their proximal mappings into the automatic differentiation framework.
Computational methods for ice flow simulation
QR and LQ Decomposition Matrix Backpropagation Algorithms for Square, Wide, and Deep -- Real or Complex -- Matrices and Their Software Implementation.
This article presents matrix backpropagation algorithms for the QR decomposition of matrices $A_{m, n}$ that are either square (m = n), wide (m < n), or deep (m > n), with rank $k = \min(m, n)$. Furthermore, we derive…

References

SHOWING 1-10 OF 21 REFERENCES
Evaluating derivatives - principles and techniques of algorithmic differentiation, Second Edition
This second edition has been updated and expanded to cover recent developments in applications and theory, including an elegant NP-completeness argument by Uwe Naumann and a brief introduction to scarcity, a generalization of sparsity.
Matrix Calculus Operations and Taylor Expansions
In problems of large dimensional complexities, matrix methods are frequently the favored mathematical tools. In this paper some extensions of matrix methods to calculus operations are introduced.
AUTOMATIC DIFFERENTIATION TOOLS IN COMPUTATIONAL DYNAMICAL SYSTEMS
In this paper we describe a unified framework for the computation of power series expansions of invariant manifolds and normal forms of vector fields, and estimate the computational cost when applied…
An efficient method for the numerical evaluation of partial derivatives of arbitrary order
The key ideas are a hyperpyramid data structure and a generalized Leibniz's rule which produces any partial derivative by forming the minimum number of products (between two lower partials) together with a product of binomial coefficients.
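The generalized Leibniz rule referenced above can be illustrated in the one-dimensional case, where the n-th derivative of a product is assembled from binomial-weighted products of lower derivatives. A minimal sketch (function names are ours):

```python
from math import comb

def leibniz_nth(f_derivs, g_derivs, n):
    """n-th derivative of f*g at a point, from derivatives of the factors.

    f_derivs[k] = f^{(k)}(x0) and g_derivs[k] = g^{(k)}(x0) for k = 0..n.
    Leibniz's rule: (f g)^{(n)} = sum_k C(n, k) f^{(k)} g^{(n-k)}.
    """
    return sum(comb(n, k) * f_derivs[k] * g_derivs[n - k]
               for k in range(n + 1))

# Check with f(x) = x**2 and g(x) = exp(x) at x0 = 1:
# (x**2 * e**x)''' = (x**2 + 6x + 6) e**x, which is 13e at x = 1.
from math import e
f_d = [1.0, 2.0, 2.0, 0.0]   # x**2 and its derivatives at x = 1
g_d = [e, e, e, e]           # exp and its derivatives at x = 1
assert abs(leibniz_nth(f_d, g_d, 3) - 13 * e) < 1e-12
```

The multivariate version cited here generalizes the binomial coefficients to products over multi-indices, which is where the hyperpyramid storage of partials pays off.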
An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation
This paper collects together a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation (AD). It highlights in particular the remarkable…
Collected Matrix Derivative Results for Forward and Reverse Mode Algorithmic Differentiation
This paper collects together a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation. It highlights in particular the remarkable…
Algorithm 755: ADOL-C: a package for the automatic differentiation of algorithms written in C/C++
The C++ package ADOL-C described here facilitates the evaluation of first and higher derivatives of vector functions that are defined by computer programs written in C or C++. The resulting…
ALGOPY: algorithmic differentiation in Python
  • Technical report, Humboldt-Universität zu Berlin,
  • 2009
Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation
  • Number 105 in Other Titles in Applied Mathematics. SIAM, Philadelphia, PA,
  • 2008