Publications
Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation, Second Edition
TLDR
This second edition has been updated and expanded to cover recent developments in applications and theory, including an elegant NP-completeness argument by Uwe Naumann and a brief introduction to scarcity, a generalization of sparsity.
Algorithm 799: revolve: an implementation of checkpointing for the reverse or adjoint mode of computational differentiation
TLDR
This article presents the function revolve, which generates checkpointing schedules that are provably optimal with respect to a primary and a secondary criterion and is intended to be used as an explicit "controller" for running a time-dependent application program.
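The checkpointing idea behind revolve can be illustrated with a minimal sketch: instead of storing every intermediate state of a time-stepping loop for the reverse sweep, store only a few checkpoints and recompute the rest. The schedule below is a naive equidistant one, purely for illustration; revolve itself computes provably optimal (binomial) schedules. The functions `step` and `step_adjoint` are hypothetical stand-ins for one forward time step and its adjoint.

```python
def step(x):
    """One hypothetical forward time step."""
    return x * x + 1.0

def step_adjoint(x, xbar):
    """Adjoint of one step: d(step)/dx = 2*x, applied to the incoming adjoint."""
    return 2.0 * x * xbar

def reverse_with_checkpoints(x0, nsteps, ncheckpoints):
    # Forward sweep: store only every stride-th state instead of all of them.
    stride = max(1, nsteps // ncheckpoints)
    checkpoints = {}
    x = x0
    for i in range(nsteps):
        if i % stride == 0:
            checkpoints[i] = x
        x = step(x)
    # Reverse sweep: recompute each needed state from the nearest checkpoint.
    xbar = 1.0  # seed: d(output)/d(output)
    for i in reversed(range(nsteps)):
        base = (i // stride) * stride
        xi = checkpoints[base]
        for _ in range(i - base):      # recompute forward up to state i
            xi = step(xi)
        xbar = step_adjoint(xi, xbar)
    return xbar                        # d(final state)/d(x0)
```

The trade-off is the one revolve optimizes: memory drops from O(nsteps) stored states to O(ncheckpoints), at the price of repeated forward recomputation.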
Getting Started with ADOL-C
  • A. Walther
  • Computer Science
    Combinatorial Scientific Computing
  • 2009
TLDR
This tutorial describes the source code modification required for the application of ADOL-C, the most frequently used drivers to evaluate derivatives and some recent developments.
On constrained optimization by adjoint based quasi-Newton methods
TLDR
A new approach to constrained optimization is proposed, based on direct and adjoint vector-function evaluations in combination with secant updating; it avoids constraint Jacobian evaluations and reduces the linear algebra cost per iteration in the dense, unstructured case.
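The secant-updating ingredient can be sketched as follows, assuming a classical rank-one (Broyden-type) update rather than the paper's specific scheme. The constraint function `c` is a hypothetical example; the point is that the Jacobian approximation is refreshed from function values alone, so no constraint Jacobian is ever evaluated.

```python
import numpy as np

def broyden_update(A, s, y):
    """Rank-one secant update: the new approximation satisfies A_new @ s = y,
    using only function-value differences (no fresh Jacobian evaluation)."""
    return A + np.outer(y - A @ s, s) / (s @ s)

# Hypothetical constraint residual with a root at (1, 2).
def c(x):
    return np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])

x = np.array([1.0, 1.0])
A = np.eye(2)                       # crude initial Jacobian approximation
for _ in range(8):
    dx = np.linalg.solve(A, -c(x))  # quasi-Newton step
    A = broyden_update(A, dx, c(x + dx) - c(x))
    x = x + dx                      # iterates approach a root of c
```

Each iteration costs one extra residual evaluation plus a rank-one update, which is where the per-iteration linear algebra savings mentioned above come from.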
Evaluating higher derivative tensors by forward propagation of univariate Taylor series
TLDR
With the approach presented, much simpler data access patterns and similar or lower operation counts can be achieved by propagating a family of univariate Taylor series of a suitable degree.
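The propagation mechanism described above can be sketched in a few lines: each intermediate value carries the coefficients of a truncated univariate Taylor series x(t) = x0 + v·t, and pushing those coefficients through every operation yields directional derivatives up to the chosen degree. The convolution rule for products below is standard Taylor arithmetic; the example function is made up.

```python
def taylor_mul(a, b):
    """Coefficients of the product of two truncated Taylor series (same length)."""
    d = len(a)
    c = [0.0] * d
    for k in range(d):
        for j in range(k + 1):
            c[k] += a[j] * b[k - j]   # Cauchy product, truncated at degree d-1
    return c

def taylor_add(a, b):
    return [x + y for x, y in zip(a, b)]

# Example: f(x) = x^3 + x, expanded at x0 = 2 along direction v = 1.
x0, v, degree = 2.0, 1.0, 3
x = [x0, v] + [0.0] * (degree - 1)    # coefficients of x(t) = x0 + v*t
x2 = taylor_mul(x, x)
f = taylor_add(taylor_mul(x2, x), x)
# f[k] = (1/k!) d^k f / dt^k, so k! * f[k] recovers the k-th directional derivative.
```

Higher partial derivative tensors are then assembled by interpolating over a family of such univariate expansions in different directions, which is the source of the simple data access patterns mentioned above.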
Automatic differentiation of explicit Runge-Kutta methods for optimal control
  • A. Walther
  • Mathematics, Computer Science
    Comput. Optim. Appl.
  • 2007
TLDR
This paper presents the integration schemes that are automatically generated when differentiating the discretization of the state equation using Automatic Differentiation (AD), and shows that they can be seen as discretizations of the adjoint differential equation of the underlying control problem.
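The correspondence can be seen already for the simplest explicit Runge-Kutta method. Differentiating the explicit Euler discretization x_{k+1} = x_k + h·f(x_k) in reverse produces the recursion lam_k = lam_{k+1} + h·f'(x_k)·lam_{k+1}, which is itself an explicit Euler scheme for the adjoint equation lam' = -f'(x)·lam run backwards in time. A scalar sketch (with an arbitrarily chosen right-hand side f = sin):

```python
import numpy as np

h, nsteps = 0.01, 100
f = np.sin        # example right-hand side of the state equation
fprime = np.cos   # its derivative, used in the adjoint recursion

def objective_grad(x0):
    # Forward Euler sweep for the state equation.
    xs = [x0]
    for _ in range(nsteps):
        xs.append(xs[-1] + h * f(xs[-1]))
    # Reverse sweep: the AD-generated scheme for the adjoint equation.
    lam = 1.0                          # seed: d x_N / d x_N
    for x in reversed(xs[:-1]):
        lam = lam + h * fprime(x) * lam
    return xs[-1], lam                 # final state and d x_N / d x_0
```

Because the adjoint sweep differentiates the discretization exactly, the returned sensitivity matches a finite-difference check on the final state up to roundoff.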
Computing sparse Hessians with automatic differentiation
A new approach for computing a sparsity pattern for a Hessian is presented: nonlinearity information is propagated through the function evaluation, yielding the nonzero structure.
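A much-simplified sketch of such nonlinearity propagation (assumed here, not the paper's exact rules): each intermediate value carries the index set of independent variables it depends on, linear operations merely merge these sets, and every nonlinear interaction marks the corresponding Hessian positions as potentially nonzero.

```python
class Tracked:
    def __init__(self, deps, hess):
        self.deps = deps    # indices of variables this value depends on
        self.hess = hess    # set of (i, j) Hessian positions marked nonzero

    def __add__(self, other):
        # Addition is linear: union the information, no new nonlinearity.
        return Tracked(self.deps | other.deps, self.hess | other.hess)

    def __mul__(self, other):
        # Multiplication nonlinearly couples every pair of dependencies.
        cross = {(i, j) for i in self.deps for j in other.deps}
        cross |= {(j, i) for (i, j) in cross}   # keep the pattern symmetric
        return Tracked(self.deps | other.deps, self.hess | other.hess | cross)

def var(i):
    return Tracked({i}, set())

# f(x) = x0*x1 + x2 + x2*x2 has Hessian nonzeros at (0,1), (1,0), (2,2).
f = var(0) * var(1) + var(2) + var(2) * var(2)
```

The resulting pattern is a (possibly conservative) overestimate, which is all a subsequent compressed Hessian evaluation needs.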
An adjoint-based SQP algorithm with quasi-Newton Jacobian updates for inequality constrained optimization
We present a sequential quadratic programming (SQP) type algorithm, based on quasi-Newton approximations of Hessian and Jacobian matrices, which is suitable for the solution of general nonlinear programming problems.
Efficient Computation of Sparse Hessians Using Coloring and Automatic Differentiation
TLDR
The experimental results show that sparsity exploitation via coloring yields enormous savings in runtime and makes the computation of Hessians of very large size feasible and the results also show that evaluating a Hessian via an indirect method is often faster than a direct evaluation.
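The compression idea behind coloring-based Hessian evaluation can be sketched as follows. Columns that share no nonzero row ("structurally orthogonal" columns) may be probed by a single Hessian-vector product, so the number of AD sweeps drops from n to the number of color groups. The greedy grouping below is a simplification of the star/acyclic colorings studied in the paper, and the tridiagonal matrix stands in for a Hessian whose sparsity pattern is known in advance.

```python
import numpy as np

def column_groups(pattern):
    """Greedy grouping of structurally orthogonal columns: two columns share
    a group only if they have no common nonzero row (simplified coloring)."""
    n = pattern.shape[1]
    color = [-1] * n
    for c in range(n):
        used = {color[c2] for c2 in range(c)
                if np.any(pattern[:, c] & pattern[:, c2])}
        color[c] = min(k for k in range(n) if k not in used)
    return color

# Hypothetical tridiagonal Hessian with a known sparsity pattern.
n = 6
H = np.diag(np.arange(1.0, n + 1)) \
    + np.diag(np.full(n - 1, 0.5), 1) + np.diag(np.full(n - 1, 0.5), -1)
pattern = H != 0
color = column_groups(pattern)
ngroups = max(color) + 1            # 3 groups instead of n = 6 products

# One Hessian-vector product per group (seed matrix S), then direct recovery.
S = np.zeros((n, ngroups))
for c, g in enumerate(color):
    S[c, g] = 1.0
B = H @ S                           # in AD: ngroups sweeps instead of n
recovered = np.zeros_like(H)
for c in range(n):
    rows = np.nonzero(pattern[:, c])[0]
    recovered[rows, c] = B[rows, color[c]]   # unique contributor per row
```

For banded or otherwise sparse Hessians the group count stays small and roughly constant as n grows, which is the source of the runtime savings reported above.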