Supervised Deep Neural Networks (DNNs) for Pricing/Calibration of Vanilla/Exotic Options Under Various Different Processes
@article{Hirsa2019SupervisedDN,
  title   = {Supervised Deep Neural Networks (DNNs) for Pricing/Calibration of Vanilla/Exotic Options Under Various Different Processes},
  author  = {Ali Hirsa and Tugce Karatas and Amir Oskoui},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1902.05810}
}
We apply supervised deep neural networks (DNNs) to the pricing and calibration of both vanilla and exotic options under both diffusion and pure jump processes, with and without stochastic volatility. We train our neural network models with different numbers of layers, different numbers of neurons per layer, and various activation functions in order to find which combinations work best empirically. For training, we consider various loss functions and optimization routines. We demonstrate that deep…
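As a minimal illustration of the supervised approach the abstract describes (a sketch, not the paper's exact setup): a small PyTorch MLP is fit to closed-form Black-Scholes call prices, the simplest stand-in for the diffusion and pure-jump models the paper covers. The depth (4 hidden layers), width (64 neurons), ReLU activation, MSE loss, and Adam optimizer are illustrative picks from the kind of grid the abstract describes.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Closed-form Black-Scholes call price, used here to label the training set."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# synthetic training set over a box of contract/model inputs
rng = np.random.default_rng(0)
n = 50_000
S = rng.uniform(50, 150, n); K = rng.uniform(50, 150, n)
T = rng.uniform(0.1, 2.0, n); r = rng.uniform(0.0, 0.05, n)
sigma = rng.uniform(0.1, 0.5, n)
X = np.column_stack([S, K, T, r, sigma]).astype(np.float32)
y = bs_call(S, K, T, r, sigma).astype(np.float32)[:, None]

model = nn.Sequential(              # 4 hidden layers x 64 neurons, ReLU:
    nn.Linear(5, 64), nn.ReLU(),    # one point in the depth/width/activation
    nn.Linear(64, 64), nn.ReLU(),   # grid the paper searches over
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
Xt, yt = torch.from_numpy(X), torch.from_numpy(y)
for epoch in range(200):            # full-batch training for brevity
    opt.zero_grad()
    loss = loss_fn(model(Xt), yt)
    loss.backward()
    opt.step()
```

In the paper's setting the labels would come from the richer pricing models (pure jump, stochastic volatility) rather than from the Black-Scholes closed form.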
21 Citations
Solving barrier options under stochastic volatility using deep learning
- Computer Science
- 2022
We develop an unsupervised deep learning method to price barrier options under the Bergomi model. The neural networks serve as approximate option surfaces and are trained to satisfy the PDE…
The CV Makes the Difference – Control Variates for Neural Networks
- Computer Science
- 2020
This technique improves deep learning applied to option pricing and increases the accuracy of the neural nets, since a large portion of the price is already captured by the control variate.
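A hedged sketch of the control-variate idea (not the cited paper's code): the network is trained only on the residual between the target price and a cheap analytic control variate, here a Black-Scholes price, and the final prediction adds the control variate back. The `target` below is a synthetic stand-in for a slow model price.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    # closed-form control variate: Black-Scholes call price
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sigma * np.sqrt(T))

rng = np.random.default_rng(1)
n = 20_000
S = rng.uniform(80, 120, n)
T = rng.uniform(0.1, 1.0, n)
sigma = rng.uniform(0.1, 0.4, n)
K, r = 100.0, 0.02

cv = bs_call(S, K, T, r, sigma)            # cheap, analytic control variate
target = cv + 0.5 * np.sin(S / K) * T      # stand-in for a slow model price
X = np.column_stack([S, T, sigma]).astype(np.float32)
resid = (target - cv).astype(np.float32)[:, None]   # the net learns only this

net = nn.Sequential(nn.Linear(3, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
Xt, rt = torch.from_numpy(X), torch.from_numpy(resid)
for _ in range(300):
    opt.zero_grad()
    loss = ((net(Xt) - rt) ** 2).mean()
    loss.backward()
    opt.step()

def price(S, T, sigma):
    # prediction = analytic control variate + learned residual correction
    x = torch.tensor(np.column_stack([S, T, sigma]), dtype=torch.float32)
    return bs_call(S, K, T, r, sigma) + net(x).detach().numpy().ravel()
```

The design payoff is that the network's target is small and smooth, so approximation error enters only through the correction term.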
An unsupervised deep learning approach to solving partial integro-differential equations
- Computer Science, Quantitative Finance
- 2022
This work investigates solving partial integro-differential equations (PIDEs) with unsupervised deep learning: a neural network serves as the candidate solution and is trained to satisfy the PIDE.
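The unsupervised recipe can be sketched as follows (a hedged illustration, not the cited code): a network V(t, S) is trained so that a differential-operator residual and the terminal payoff condition vanish on randomly sampled collocation points. For brevity the sketch uses the Black-Scholes PDE; the PIDE case adds a jump integral, typically approximated by quadrature inside the same loss.

```python
import torch
import torch.nn as nn

r, sigma, K, T = 0.02, 0.2, 100.0, 1.0
net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    # interior collocation points (t, S), derivatives via autograd
    t = (torch.rand(1024, 1) * T).requires_grad_(True)
    S = (torch.rand(1024, 1) * 200.0 + 1.0).requires_grad_(True)
    V = net(torch.cat([t, S], dim=1))
    V_t = torch.autograd.grad(V, t, torch.ones_like(V), create_graph=True)[0]
    V_S = torch.autograd.grad(V, S, torch.ones_like(V), create_graph=True)[0]
    V_SS = torch.autograd.grad(V_S, S, torch.ones_like(V_S), create_graph=True)[0]
    # Black-Scholes operator residual (the PIDE adds a jump-integral term here)
    pde = V_t + 0.5 * sigma**2 * S**2 * V_SS + r * S * V_S - r * V
    # terminal condition V(T, S) = max(S - K, 0) enforced on fresh samples
    S_T = torch.rand(1024, 1) * 200.0 + 1.0
    V_T = net(torch.cat([torch.full_like(S_T, T), S_T], dim=1))
    loss = (pde**2).mean() + ((V_T - torch.clamp(S_T - K, min=0))**2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```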
FX Volatility Calibration Using Artificial Neural Networks
- Computer Science
- 2020
This dissertation considers Heston’s stochastic volatility model, and demonstrates how the calibration map from quoted implied volatilities to model parameters can be effectively learned using an ANN, and explores the possibility of approximating the leverage function using a series of ANNs.
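A minimal sketch of the calibration-map direction (assumptions, not the dissertation's code): a network maps a grid of quoted implied volatilities directly to model parameters, trained on pairs generated offline by a conventional Heston pricer. The random arrays below are placeholders that keep the sketch self-contained; the quote-grid size and the five Heston parameters (kappa, theta, vol-of-vol, rho, v0) are the assumed interface.

```python
import numpy as np
import torch
import torch.nn as nn

n_quotes = 40          # e.g. 8 strikes x 5 maturities of implied vols
n_params = 5           # Heston: kappa, theta, vol-of-vol, rho, v0

# placeholders for (surface, parameters) pairs produced by a Heston pricer
vols = np.random.rand(10_000, n_quotes).astype(np.float32)
params = np.random.rand(10_000, n_params).astype(np.float32)

net = nn.Sequential(nn.Linear(n_quotes, 128), nn.ReLU(),
                    nn.Linear(128, 128), nn.ReLU(),
                    nn.Linear(128, n_params), nn.Sigmoid())  # params scaled to (0, 1)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
X, y = torch.from_numpy(vols), torch.from_numpy(params)
for _ in range(500):
    opt.zero_grad()
    loss = ((net(X) - y) ** 2).mean()
    loss.backward()
    opt.step()
# at run time, calibration is a single forward pass on the observed surface
```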
Deep Option Pricing - Term Structure Models
- Computer Science, SSRN Electronic Journal
- 2019
This paper proposes a data-driven approach, by means of an Artificial Neural Network (ANN), to value financial options within the setting of interest rate term structure models. This aims to…
A neural network-based framework for financial model calibration
- Computer Science, Journal of Mathematics in Industry
- 2019
The rapid online learning of implied volatility by ANNs, combined with an adapted parallel global optimization method, tackles the computational bottleneck and provides a fast and reliable technique for calibrating model parameters while avoiding, as much as possible, getting stuck in local minima.
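A hedged sketch of this two-stage pattern: a trained network serves as a fast surrogate for the parameters-to-implied-vols map, and a global optimizer (here SciPy's differential evolution, standing in for the paper's adapted parallel method) searches parameter space against market quotes while calling only the surrogate. The untrained `surrogate` and random `market_vols` below are placeholders; in practice the surrogate is first fit on synthetic (parameter, vol-surface) pairs.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import differential_evolution

n_params, n_quotes = 5, 40
surrogate = nn.Sequential(nn.Linear(n_params, 128), nn.ReLU(),
                          nn.Linear(128, n_quotes))   # params -> implied vols
market_vols = np.random.rand(n_quotes)                # placeholder market quotes

def objective(p):
    """Squared error between surrogate-implied vols and market vols."""
    with torch.no_grad():
        model_vols = surrogate(torch.tensor(p, dtype=torch.float32)).numpy()
    return float(((model_vols - market_vols) ** 2).sum())

bounds = [(0.0, 1.0)] * n_params   # illustrative parameter bounds
result = differential_evolution(objective, bounds, seed=0, maxiter=50)
print(result.x)                    # calibrated parameters under the surrogate
```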
Option Pricing With Machine Learning
- Economics, SSRN Electronic Journal
- 2019
Existing methods that use neural networks to price market and model prices, perform calibration, and price exotic options are reviewed; the feasibility of these methods is discussed, problems are highlighted, and alternative solutions are proposed.
Extensive networks would eliminate the demand for pricing formulas
- Computer Science, Knowledge-Based Systems
- 2022
Machine Learning Solutions to Challenges in Finance: An Application to the Pricing of Financial Products
- Economics, Technological Forecasting and Social Change
- 2020
This paper proposes a machine-learning method to price arithmetic and geometric average options accurately and, in particular, quickly; the method is validated through empirical applications as well as numerical experiments.
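As one plausible way to produce training labels for such average (Asian) options, hedged since the paper's exact data generation may differ: Monte Carlo pricing of an arithmetic Asian call under geometric Brownian motion.

```python
import numpy as np

def asian_call_mc(S0, K, T, r, sigma, n_steps=50, n_paths=100_000, seed=0):
    """Monte Carlo price of an arithmetic-average Asian call under GBM."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # simulate log-price increments for all paths at once
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    paths = S0 * np.exp(log_paths)
    avg = paths.mean(axis=1)                 # arithmetic average over each path
    payoff = np.maximum(avg - K, 0.0)
    return np.exp(-r * T) * payoff.mean()    # discounted expected payoff

print(asian_call_mc(100.0, 100.0, 1.0, 0.02, 0.2))   # one training label
```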
Neural Networks Based Dynamic Implied Volatility Surface
- Economics, SSRN Electronic Journal
- 2019
A pricing model is tied to its ability to capture the dynamics of the spot price process; misspecification will lead to pricing and hedging errors. A parametric pricing formula depends on the…
References
Showing 1-10 of 29 references
Machine Learning in Finance: The Case of Deep Learning for Option Pricing
- Computer Science
- 2017
A framework for applying machine learning in finance, with specific application to option pricing, is summarized, and a fully connected feed-forward deep neural network is trained to reproduce the Black and Scholes (1973) option pricing formula to a high degree of accuracy.
Machine learning for quantitative finance: fast derivative pricing, hedging and fitting
- Computer Science, Quantitative Finance
- 2018
It is illustrated that for many classical problems, the price of extra speed is some loss of accuracy, but this reduced accuracy is often well within reasonable limits and hence very acceptable from a practical point of view.
A Convergence Theory for Deep Learning via Over-Parameterization
- Computer Science, ICML
- 2019
This work proves that stochastic gradient descent can find global minima of the DNN training objective in polynomial time, and implies an equivalence between over-parameterized neural networks and the neural tangent kernel (NTK) in the finite (and polynomial) width setting.
Convergence Analysis of Two-layer Neural Networks with ReLU Activation
- Computer Science, NIPS
- 2017
A convergence analysis of SGD is provided for a rich subset of two-layer feedforward networks with ReLU activations, characterized by a special structure called "identity mapping"; it proves that if the input follows a Gaussian distribution, then with standard O(1/√d) initialization of the weights, SGD converges to the global minimum in a polynomial number of steps.
Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- Computer Science, IEEE Transactions on Information Theory
- 2019
It is shown that with quadratic activations, the optimization landscape of training such shallow neural networks has certain favorable characteristics that allow globally optimal models to be found efficiently using a variety of local search heuristics.
No bad local minima: Data independent training error guarantees for multilayer neural networks
- Computer Science, ArXiv
- 2016
It is proved that for an MNN with one hidden layer, the training error is zero at every differentiable local minimum, for almost every dataset and dropout-like noise realization; the result is extended to the case of more than one hidden layer.
Wide Residual Networks
- Computer Science, BMVC
- 2016
This paper conducts a detailed experimental study of the architecture of ResNet blocks and proposes a novel architecture in which the depth of residual networks is decreased and their width increased; the resulting network structures, called wide residual networks (WRNs), are far superior to their commonly used thin and very deep counterparts.
Understanding the difficulty of training deep feedforward neural networks
- Computer Science, AISTATS
- 2010
The objective here is to better understand why standard gradient descent from random initialization does so poorly with deep neural networks, to shed light on recent relative successes, and to help design better algorithms in the future.
Pricing American options under variance gamma
- Mathematics
- 2003
We derive a form of the partial integro-differential equation (PIDE) for pricing American options under the variance gamma (VG) process. We then develop a numerical algorithm to solve for values of…
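For context, the European-style PIDE under an exponential Lévy model takes the standard form sketched below, with k(y) the variance gamma Lévy density; this is a generic reference form, not the paper's American-option derivation, which additionally handles the early-exercise region.

```latex
\frac{\partial V}{\partial t} + (r - q)\,S\,\frac{\partial V}{\partial S} - r V
  + \int_{-\infty}^{\infty}\!\Big[V(t, S e^{y}) - V(t, S)
  - S\,(e^{y}-1)\,\frac{\partial V}{\partial S}\Big]\,k(y)\,dy = 0,
\qquad
k(y) = \frac{\exp\!\big(\theta y / \sigma^{2}\big)}{\nu\,\lvert y\rvert}
       \exp\!\Big(-\frac{\lvert y\rvert}{\sigma}
       \sqrt{\tfrac{2}{\nu} + \tfrac{\theta^{2}}{\sigma^{2}}}\Big).
```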
Effect of Depth and Width on Local Minima in Deep Learning
- Computer Science, Neural Computation
- 2019
With a locally induced structure on deep nonlinear neural networks, the values of local minima of neural networks are theoretically proven to be no worse than the globally optimal values of corresponding classical machine learning models.