Pricing options on flow forwards by neural networks in Hilbert space

@article{Benth2022PricingOO,
  title={Pricing options on flow forwards by neural networks in Hilbert space},
  author={Fred Espen Benth and Nils Detering and Luca Galimberti},
  journal={ArXiv},
  year={2022},
  volume={abs/2202.11606}
}
We propose a new methodology for pricing options on flow forwards by applying infinite-dimensional neural networks. We recast the pricing problem as an optimization problem in a Hilbert space of real-valued functions on the positive real line, which is the state space for the term structure dynamics. This optimization problem is solved using a novel feedforward neural network architecture designed to approximate continuous functions on the state space. The proposed neural net is… 
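The core idea of the abstract can be illustrated with a minimal sketch: project each forward curve (an element of the Hilbert space) onto a finite grid, then feed the sampled curve through a feedforward network that maps curves to a scalar price. All names, sizes, and the toy curve family below are illustrative assumptions, not the paper's actual architecture, and the network is untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid on the positive half-line (time to maturity): a finite-dimensional
# projection of the function-valued state.
n_grid = 64
grid = np.linspace(0.0, 2.0, n_grid)

def sample_forward_curve():
    """Toy forward curve f(x) = a + b*exp(-c*x) with random coefficients
    (a hypothetical stand-in for a curve drawn from the term structure model)."""
    a = rng.uniform(1.0, 2.0)
    b = rng.uniform(-0.5, 0.5)
    c = rng.uniform(0.5, 3.0)
    return a + b * np.exp(-c * grid)

# Two-layer feedforward net: R^n_grid -> R (sampled curve -> price estimate).
W1 = rng.normal(scale=1.0 / np.sqrt(n_grid), size=(n_grid, 32))
b1 = np.zeros(32)
W2 = rng.normal(scale=1.0 / np.sqrt(32), size=(32, 1))
b2 = np.zeros(1)

def price_net(curve):
    h = np.maximum(curve @ W1 + b1, 0.0)  # ReLU hidden layer
    return float((h @ W2 + b2)[0])        # scalar price estimate

print(price_net(sample_forward_curve()))
```

In practice the weights would be fitted by minimizing a loss against simulated option payoffs; the point of the sketch is only the curve-to-scalar structure of the approximation problem.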
1 Citation

Neural Networks in Fréchet spaces
TLDR
It is shown that the resulting neural networks on infinite-dimensional spaces can be projected down to finite-dimensional subspaces with any desired accuracy, thus obtaining approximating networks that are easy to implement and allow for fast computation and fitting.

References

SHOWING 1-10 OF 28 REFERENCES
Representation of Infinite-Dimensional Forward Price Models in Commodity Markets
We study the forward price dynamics in commodity markets realised as a process with values in a Hilbert space of absolutely continuous functions defined by Filipović (Consistency problems for …
Accuracy of deep learning in calibrating HJM forward curves
TLDR
A new class of volatility operators is introduced which map the square integrable noise into the Filipović space of forward curves, and a deterministic parametrized version of it is specified.
Neural Networks for Option Pricing and Hedging: A Literature Review
TLDR
This note intends to provide a comprehensive review of neural networks as a nonparametric method for option pricing and hedging since the early 1990s in terms of input features, output variables, benchmark models, performance measures, data partition methods, and underlying assets.
Deep calibration of rough stochastic volatility models
TLDR
This work showcases a direct comparison of different potential approaches to the learning stage, presents algorithms that provide a sufficient accuracy for practical use, and provides a first neural network-based calibration method for rough volatility models for which calibration can be done on the fly.
Deep ReLU Network Expression Rates for Option Prices in high-dimensional, exponential Lévy models
TLDR
Under stronger, dimension-uniform nondegeneracy conditions on the Lévy symbol, algebraic expression rates of option prices in exponential Lévy models which are free from the curse of dimensionality are obtained.
Approximating Lévy Semistationary Processes via Fourier Methods in the Context of Power Markets
TLDR
This paper introduces and analyzes a Fourier simulation scheme for obtaining trajectories of Lévy semistationary processes in an iterative manner, and demonstrates that the proposed scheme is well suited for simulation of a wide range of LSS processes, including, in particular, LSS processes indexed by a kernel function which is steep close to the origin.
An overview on deep learning-based approximation methods for partial differential equations
TLDR
An introduction to this field of research is provided, along with some of the main ideas of deep learning-based approximation methods for PDEs, a revisit of one of the central mathematical results for deep neural network approximations for PDEs, and an overview of the recent literature.
Solving high-dimensional partial differential equations using deep learning
TLDR
A deep learning-based approach that can handle general high-dimensional parabolic PDEs using backward stochastic differential equations, in which the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning with the gradient acting as the policy function.
A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations
TLDR
It is proved for the first time that, in the case of semilinear heat equations with gradient-independent nonlinearities, the number of parameters of the employed deep neural networks grows at most polynomially in both the PDE dimension and the reciprocal of the prescribed approximation accuracy.
Jump-diffusions in Hilbert spaces: existence, stability and numerics
By means of an original approach, called the 'method of the moving frame', we establish existence, uniqueness and stability results for mild and weak solutions of stochastic partial differential equations …