A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning

@article{Srajer2018ABO,
  title={A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning},
  author={F. Srajer and Z. Kukelova and A. Fitzgibbon},
  journal={Optimization Methods and Software},
  year={2018},
  volume={33},
  pages={889--906}
}
Algorithmic differentiation (AD) allows exact computation of derivatives given only an implementation of an objective function. Although many AD tools are available, a proper and efficient implementation of AD methods is not straightforward. The existing tools are often too different to allow for a general test suite. In this paper, we compare 15 ways of computing derivatives including 11 automatic differentiation tools implementing various methods and written in various languages (C++, F…
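
For readers new to the idea, the following minimal sketch (invented for this summary, not taken from the paper or from any of the benchmarked tools) shows how forward-mode AD recovers an exact derivative from nothing more than an implementation of the objective, here a made-up f(x) = x^2 + sin(x):

#include <cmath>
#include <cstdio>

// A dual number carries a value and the derivative of that value
// with respect to one chosen (seeded) input.
struct Dual {
    double val;  // function value
    double dot;  // derivative with respect to the seeded input
};

Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.dot + b.dot}; }
Dual operator*(Dual a, Dual b) { return {a.val * b.val, a.dot * b.val + a.val * b.dot}; }
Dual sin(Dual a) { return {std::sin(a.val), std::cos(a.val) * a.dot}; }

// Any objective written against these overloads yields exact derivatives.
Dual f(Dual x) { return x * x + sin(x); }

int main() {
    Dual x{2.0, 1.0};  // seed dx/dx = 1
    Dual y = f(x);
    std::printf("f(2) = %g, f'(2) = %g\n", y.val, y.dot);  // f'(x) = 2x + cos(x)
    return 0;
}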
Vector Forward Mode Automatic Differentiation on SIMD/SIMT architectures
TLDR
It is demonstrated that the forward mode can outperform the reverse mode for programs with tens or hundreds of directional derivatives, a number that may yet increase if current hardware trends continue.
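
A rough sketch of what vector forward mode means, assuming a hand-rolled VDual type rather than the system described in the paper: each value carries N tangents, so one sweep produces N directional derivatives, and the inner loops over the tangent array are exactly the kind of dense code that maps onto SIMD lanes.

#include <array>
#include <cstdio>

template <int N>
struct VDual {
    double val;
    std::array<double, N> dot;  // N directional derivatives carried together
};

template <int N>
VDual<N> operator*(const VDual<N>& a, const VDual<N>& b) {
    VDual<N> r{a.val * b.val, {}};
    for (int i = 0; i < N; ++i)  // dense loop a compiler can vectorize
        r.dot[i] = a.dot[i] * b.val + a.val * b.dot[i];
    return r;
}

int main() {
    // f(x0, x1) = x0 * x1, seeding both canonical directions at once:
    VDual<2> x0{3.0, {1.0, 0.0}};
    VDual<2> x1{5.0, {0.0, 1.0}};
    VDual<2> y = x0 * x1;
    std::printf("df/dx0 = %g, df/dx1 = %g\n", y.dot[0], y.dot[1]);  // 5 and 3
    return 0;
}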
Computation of higher order Lie derivatives on the Infinity Computer
TLDR
A novel approach, based on Infinity Computer arithmetic, is presented for calculating the Lie derivative of a function even when its analytical expression is not available.
Julia Language in Machine Learning: Algorithms, Applications, and Open Issues
TLDR
This paper summarizes the related research work and developments in the application of the Julia language in machine learning, and investigates applications of the machine learning algorithms implemented with the Julia language.
Instead of Rewriting Foreign Code for Machine Learning, Automatically Synthesize Fast Gradients
TLDR
Enzyme synthesizes gradients for programs written in any language whose compiler targets LLVM IR including C, C++, Fortran, Julia, Rust, Swift, MLIR, etc., thereby providing native AD capabilities in these languages.
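
The usage pattern, sketched after Enzyme's published C/C++ examples (this builds only with the Enzyme LLVM plugin loaded, which is what resolves the __enzyme_autodiff symbol):

#include <cstdio>

// Resolved by the Enzyme compiler plugin, which synthesizes the
// gradient directly from the LLVM IR of the target function.
extern double __enzyme_autodiff(void*, double);

double square(double x) { return x * x; }

int main() {
    // Enzyme differentiates the compiled IR of square; the original
    // function needs no operator overloading or source rewriting.
    double dy = __enzyme_autodiff((void*)square, 3.0);
    std::printf("d(square)/dx at x = 3: %g\n", dy);  // 6
    return 0;
}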
A Simulink-based software solution using the Infinity Computer methodology for higher order differentiation
TLDR
A new module computing higher-order Lie derivatives, embedded in the numerical solution of ordinary differential equations, has been implemented within the Simulink-based Infinity Computer solution recently introduced by the authors.
Efficient differentiable programming in a functional array-processing language
TLDR
It is shown that, in combination, gradient computation with forward-mode AD can be as efficient as reverse mode, and that the Jacobian matrices required for numerical algorithms such as Gauss-Newton and Levenberg-Marquardt can be computed efficiently.
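
As a rough illustration of the Jacobian claim, with an invented exponential-fit model r_i(a, b) = a*exp(b*t_i) - y_i and a hand-rolled dual-number type: one forward sweep per parameter fills one Jacobian column, which is exactly what Gauss-Newton and Levenberg-Marquardt consume.

#include <cmath>
#include <cstdio>

struct Dual { double val, dot; };
Dual operator*(Dual a, Dual b) { return {a.val * b.val, a.dot * b.val + a.val * b.dot}; }
Dual operator*(Dual a, double b) { return {a.val * b, a.dot * b}; }
Dual operator-(Dual a, double b) { return {a.val - b, a.dot}; }
Dual exp(Dual a) { double e = std::exp(a.val); return {e, e * a.dot}; }

int main() {
    // Residuals r_i(a, b) = a*exp(b*t_i) - y_i, evaluated at a = b = 1.
    const double t[3] = {0.0, 1.0, 2.0};
    const double y[3] = {1.0, 2.7, 7.4};
    double J[3][2];                        // m = 3 residuals, n = 2 parameters
    for (int j = 0; j < 2; ++j) {          // one forward sweep per parameter
        Dual a{1.0, j == 0 ? 1.0 : 0.0};   // seed d/da or d/db
        Dual b{1.0, j == 1 ? 1.0 : 0.0};
        for (int i = 0; i < 3; ++i)
            J[i][j] = (a * exp(b * t[i]) - y[i]).dot;
    }
    std::printf("last row of J: %g %g\n", J[2][0], J[2][1]);  // e^2 and 2e^2
    return 0;
}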
Remarks on stochastic automatic adjoint differentiation and financial models calibration
TLDR
It is demonstrated that AAD allows perfect SIMD (Single Instruction, Multiple Data) parallelization, and its relative computational cost is provided.
Digital image recognition based on Fractional-order-PCA-SVM coupling algorithm
TLDR
The Fractional-order-PCA-SVM coupling algorithm designed in this paper is effective and meets the requirements of the recognition task.
Hashing modulo alpha-equivalence
TLDR
A new, asymptotically efficient way to hash modulo alpha-equivalence is presented, using a weak (commutative) hash combiner at exactly one point in the construction, which admits an algorithm with O(n (log n)^2) time complexity.
Flexible and efficient optimization of quantitative sequences using automatic differentiation of Bloch simulations
To investigate a computationally efficient method for optimizing the Cramér‐Rao Lower Bound (CRLB) of quantitative sequences without using approximations or an analytical expression of the signal.

References

Showing 1-10 of 29 references.
Automatic differentiation in machine learning: a survey
TLDR
By precisely defining the main differentiation techniques and their interrelationships, this work aims to bring clarity to the usage of the terms "autodiff", "automatic differentiation", and "symbolic differentiation" as these are encountered more and more in machine learning settings.
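
One way to see the distinction the survey draws, in an invented example: symbolic differentiation manipulates an expression for f', whereas AD propagates numeric derivative values through the original program, so differentiating the product below costs one extra multiply-add per iteration rather than producing an n-term expression.

#include <cstdio>

int main() {
    // f(x) = prod_{i=0}^{n-1} (x - i). A symbolic derivative of this product
    // has n terms of n-1 factors each; forward-mode AD instead carries a
    // (value, derivative) pair through the same O(n) loop that evaluates f.
    const int n = 5;
    const double x = 7.0;
    double val = 1.0, dot = 0.0;
    for (int i = 0; i < n; ++i) {
        dot = dot * (x - i) + val;  // product rule applied to numbers, not symbols
        val = val * (x - i);
    }
    std::printf("f(7) = %g, f'(7) = %g\n", val, dot);  // 2520 and 2754
    return 0;
}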
Evaluating derivatives - principles and techniques of algorithmic differentiation, Second Edition
TLDR
This second edition has been updated and expanded to cover recent developments in applications and theory, including an elegant NP-completeness argument by Uwe Naumann and a brief introduction to scarcity, a generalization of sparsity.
Fast Reverse-Mode Automatic Differentiation using Expression Templates in C++
R. Hogan, ACM Trans. Math. Softw., 2014
TLDR
A fast new operator-overloading method is presented that uses the expression template programming technique in C++ to provide a compile-time representation of each mathematical expression as a computational graph that can be efficiently traversed in either direction.
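
For intuition only, a drastically simplified run-time tape; Hogan's expression-template design is precisely about avoiding this kind of per-operation run-time recording by building each statement's graph at compile time.

#include <cmath>
#include <cstdio>
#include <vector>

// Each tape node stores its parents and the local partial derivatives.
struct Node { int lhs, rhs; double dlhs, drhs; };
static std::vector<Node> tape;

struct Var { int id; double val; };
Var leaf(double v) { tape.push_back({-1, -1, 0.0, 0.0}); return {(int)tape.size() - 1, v}; }
Var operator*(Var a, Var b) {
    tape.push_back({a.id, b.id, b.val, a.val});
    return {(int)tape.size() - 1, a.val * b.val};
}
Var sin(Var a) {
    tape.push_back({a.id, -1, std::cos(a.val), 0.0});
    return {(int)tape.size() - 1, std::sin(a.val)};
}

// A single reverse sweep pushes the output adjoint back to every input.
std::vector<double> grad(Var out) {
    std::vector<double> adj(tape.size(), 0.0);
    adj[out.id] = 1.0;
    for (int i = out.id; i >= 0; --i) {
        if (tape[i].lhs >= 0) adj[tape[i].lhs] += tape[i].dlhs * adj[i];
        if (tape[i].rhs >= 0) adj[tape[i].rhs] += tape[i].drhs * adj[i];
    }
    return adj;
}

int main() {
    Var x = leaf(2.0), y = leaf(3.0);
    Var z = x * sin(y);
    std::vector<double> adj = grad(z);
    std::printf("dz/dx = %g, dz/dy = %g\n", adj[x.id], adj[y.id]);  // sin(3), 2*cos(3)
    return 0;
}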
Comparison of Automatic and Symbolic Differentiation in Mathematical Modeling and Computer Simulation of Rigid-Body Systems
TLDR
ADOL-C was superior to MACSYMA concerning the preliminary work and modifications of the functions to be differentiated and the symbolic differentiation task to be performed, and, contrary to MACSYMA, no limits of applicability were observed for ADOL-C, even in the simulation of complex multi-body systems.
Combining source transformation and operator overloading techniques to compute derivatives for MATLAB programs
TLDR
A novel software tool is proposed to automatically transform a given MATLAB program into another MATLAB program capable of computing not only the original function but also user-specified derivatives of that function.
An efficient overloaded method for computing derivatives of mathematical functions in MATLAB
TLDR
An object-oriented method is presented that computes, without truncation error, derivatives of functions defined by MATLAB computer codes; the derivatives are generated by simply evaluating the function on an instance of the class, making the method straightforward to use while enabling differentiation of highly complex functions.
Forward-Mode Automatic Differentiation in Julia
TLDR
ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support for higher-order differentiation and differentiation using custom number types.
The Tapenade automatic differentiation tool: Principles, model, and specification
TLDR
The principles of Tapenade, a subset of the general principles of AD, are described, and the extensions of the tool planned for the foreseeable future, deriving from ongoing research on AD, are presented.
Bundle Adjustment in the Large
TLDR
The experiments show that truncated Newton methods, when paired with relatively simple preconditioners, offer state-of-the-art performance for large-scale bundle adjustment.
From learning models of natural image patches to whole image restoration
TLDR
A generic framework is proposed that allows whole-image restoration using any patch-based prior for which a MAP (or approximate MAP) estimate can be calculated, and a generic, surprisingly simple Gaussian mixture prior learned from a set of natural images is presented.