Corpus ID: 245650489

On automatic differentiation for the Matérn covariance

@inproceedings{Marin2022OnAD,
  title={On automatic differentiation for the Mat\'ern covariance},
  author={Oana Marin and Christopher J. Geoga and Michel Schanen},
  year={2022}
}
To target challenges in differentiable optimization, we analyze and propose strategies for computing derivatives of the Matérn kernel with respect to the smoothness parameter. This problem is of high interest in Gaussian process modelling due to the lack of robust derivatives of the modified Bessel function of the second kind with respect to its order. In the current work we focus on newly identified series expansions for the modified Bessel function of the second kind that are valid for complex orders. Using these…
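To make the object of study concrete, the sketch below (not taken from the paper) writes down the standard Matérn covariance in terms of the modified Bessel function of the second kind, K_ν, and approximates the problematic derivative with respect to the smoothness ν by a central difference — a plain baseline, not the series-expansion strategy the paper proposes. The parameterization (range `rho`, variance `sigma2`) is the common one and is an assumption here.

```python
import numpy as np
from scipy.special import kv, gamma

def matern(d, nu, rho=1.0, sigma2=1.0):
    """Matérn covariance at distance d with smoothness nu, range rho, variance sigma2.

    Standard form: sigma2 * 2^(1-nu)/Gamma(nu) * s^nu * K_nu(s), s = sqrt(2 nu) d / rho.
    """
    d = np.maximum(d, 1e-12)  # avoid the singular limit at d = 0
    s = np.sqrt(2.0 * nu) * d / rho
    return sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * s ** nu * kv(nu, s)

def dmatern_dnu(d, nu, rho=1.0, sigma2=1.0, h=1e-6):
    """Central-difference approximation of the derivative in the smoothness nu.

    This is exactly the kind of fallback whose accuracy the paper's series
    expansions are meant to improve upon: kv has no robust order-derivative.
    """
    return (matern(d, nu + h, rho, sigma2)
            - matern(d, nu - h, rho, sigma2)) / (2.0 * h)
```

For ν = 1/2 the Matérn kernel reduces to the exponential covariance σ² exp(−d/ρ), which gives a quick sanity check on the implementation.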


References

Showing 1–10 of 18 references

Automatic Differentiation Through the Use of Hyper-Dual Numbers for Second Derivatives

A number system termed hyper-dual numbers is developed, which produces exact first- and second-derivative information; the approach is demonstrated on an unstructured, parallel, unsteady Reynolds-Averaged Navier-Stokes solver.
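A minimal illustration of the idea (a sketch, not the cited authors' implementation): a hyper-dual number carries components f + d1·ε₁ + d2·ε₂ + d12·ε₁ε₂ with ε₁² = ε₂² = 0 but ε₁ε₂ ≠ 0. Seeding d1 = d2 = 1 makes the d1 part of any product chain carry the exact first derivative and the d12 part the exact second derivative, with no truncation error.

```python
class HyperDual:
    """Hyper-dual number f + d1*e1 + d2*e2 + d12*e1*e2, with e1^2 = e2^2 = 0."""

    def __init__(self, f, d1=0.0, d2=0.0, d12=0.0):
        self.f, self.d1, self.d2, self.d12 = f, d1, d2, d12

    def __add__(self, other):
        return HyperDual(self.f + other.f, self.d1 + other.d1,
                         self.d2 + other.d2, self.d12 + other.d12)

    def __mul__(self, other):
        # Product rule applied to each nilpotent component.
        return HyperDual(
            self.f * other.f,
            self.f * other.d1 + self.d1 * other.f,
            self.f * other.d2 + self.d2 * other.f,
            self.f * other.d12 + self.d1 * other.d2
            + self.d2 * other.d1 + self.d12 * other.f,
        )

# f(x) = x^3 at x = 2: seed d1 = d2 = 1, d12 = 0.
x = HyperDual(2.0, 1.0, 1.0, 0.0)
y = x * x * x
# y.f = f(2) = 8, y.d1 = f'(2) = 12, y.d12 = f''(2) = 12 -- all exact.
```

Unlike finite differences, there is no step size to tune, which is why such number systems are attractive for the second-derivative computations this reference targets.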

Derivative convergence for iterative equation solvers

When nonlinear equation solvers are applied to parameter-dependent problems, their iterates can be interpreted as functions of these variable parameters. The derivatives (if they exist) of these

Special function neural network (SFNN) models

  • Yuzhen Liu, O. Marin
  • Mathematics, Computer Science
    2021 IEEE International Conference on Cluster Computing (CLUSTER)
  • 2021
Neural network models are developed to serve as stand-ins for special functions, focusing on the Bessel functions of the first and second kind and their derivatives.

Higher-order automatic differentiation of mathematical functions

Error Bounds for the Large-Argument Asymptotic Expansions of the Hankel and Bessel Functions

In this paper, we reconsider the large-argument asymptotic expansions of the Hankel, Bessel and modified Bessel functions and their derivatives. New integral representations for the remainder terms
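The large-argument expansion in question can be sketched as follows (a standard DLMF-style series, not the new remainder bounds of the cited paper): with μ = 4ν², K_ν(z) ~ √(π/(2z)) e^{−z} [1 + (μ−1)/(8z) + (μ−1)(μ−9)/(2!(8z)²) + …]. For half-integer orders the series terminates and is exact.

```python
import math
from scipy.special import kv

def kv_large_z(nu, z, terms=4):
    """Partial sum of the large-argument asymptotic series for K_nu(z).

    With mu = 4*nu^2, successive terms obey
      term_k = term_{k-1} * (mu - (2k-1)^2) / (8 k z).
    """
    mu = 4.0 * nu * nu
    total, term = 1.0, 1.0
    for k in range(1, terms):
        term *= (mu - (2 * k - 1) ** 2) / (8.0 * k * z)
        total += term
    return math.sqrt(math.pi / (2.0 * z)) * math.exp(-z) * total
```

For ν = 3/2 the factor (μ − 9) vanishes, the series terminates after two terms, and the partial sum matches `scipy.special.kv` to machine precision; for generic orders the truncation error is governed by the remainder terms the cited paper bounds.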

Reverse accumulation and attractive fixed points

It is shown how to re-use the computational graph for the fixed-point constructor Φ so as to set explicit stopping criteria for the iterations based on the required gradient accuracy, which allows the gradient vector to be obtained to the same order of accuracy as the objective function values.

Interpolation of Spatial Data: Some Theory for Kriging

This chapter discusses the role of asymptotics for BLPs, applications of the equivalence and orthogonality of Gaussian measures to linear prediction, and the importance of observations that are not part of a sequence.

Arbogast: Higher order automatic differentiation for special functions with Modular C

This high-level toolbox for the calculus with Taylor polynomials is named after L.F.A. Arbogast (1759–1803), a French mathematician from Strasbourg (Alsace), for his pioneering work in derivation

Automatic differentiation and iterative processes

We identify a class of iterative processes that can be used in the definition of a function while preserving the good behavior of automatic differentiation codes on this function. By iterative processes…
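The good behavior described here can be seen in a small sketch (an illustration, not the cited paper's construction): pushing first-order dual numbers through the Babylonian iteration x ← (x + a/x)/2 makes the derivative part converge to d√a/da = 1/(2√a) alongside the value itself.

```python
class Dual:
    """First-order dual number v + d*eps with eps^2 = 0 (forward-mode AD)."""

    def __init__(self, v, d=0.0):
        self.v, self.d = v, d

    def __add__(self, other):
        return Dual(self.v + other.v, self.d + other.d)

    def __truediv__(self, other):
        # Quotient rule: (v/w)' = (v' w - v w') / w^2.
        return Dual(self.v / other.v,
                    (self.d * other.v - self.v * other.d) / (other.v * other.v))

def sqrt_via_newton(a_val, iters=30):
    """Differentiate the fixed-point iteration x <- (x + a/x)/2 w.r.t. a."""
    a = Dual(a_val, 1.0)   # seed: da/da = 1
    x = Dual(a_val, 1.0)   # starting guess x0 = a
    half = Dual(2.0, 0.0)
    for _ in range(iters):
        x = (x + a / x) / half
    return x.v, x.d

val, der = sqrt_via_newton(2.0)
# val -> sqrt(2), der -> 1/(2*sqrt(2)): the derivative of the iterates
# converges along with the iterates themselves.
```

This is the phenomenon the iterative-differentiation references above analyze: for contractive fixed-point maps, differentiating the iteration yields derivatives of the limit.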

An Overview of Software Development for Special Functions

This paper concentrates on the third step from the viewpoint of a numerical analyst working on software for elementary and special functions.