Fast Sampling from Time-Integrated Bridges using Deep Learning
@inproceedings{Perotti2021FastSF, title={Fast Sampling from Time-Integrated Bridges using Deep Learning}, author={Leonardo Perotti and Lech A. Grzelak}, year={2021} }
We propose a methodology to sample from time-integrated stochastic bridges, namely random variables defined as $\int_{t_1}^{t_2} f(Y(t))\,dt$ conditioned on $Y(t_1) = a$ and $Y(t_2) = b$, with $a, b \in \mathbb{R}$. The techniques developed in [8] – the Stochastic Collocation Monte Carlo sampler – and in [14] – the Seven-League scheme – are applied for this purpose. Notably, the distribution of the time-integrated bridge is approximated utilizing a polynomial chaos expansion built on a suitable set of stochastic collocation…
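To make the target random variable concrete, the following is a minimal sketch that draws brute-force Monte Carlo samples of the time-integrated bridge in the simplest setting: $Y$ a Brownian motion (so the conditioned process is a Brownian bridge) and $f(x) = x$ on a fine grid. The function name and parameters are illustrative assumptions; this is only a reference baseline, not the SCMC/Seven-League construction proposed in the paper.

```python
import numpy as np

def sample_time_integrated_bridge(a, b, t1, t2, f=lambda x: x,
                                  n_steps=1_000, n_paths=100_000, seed=0):
    """Brute-force samples of int_{t1}^{t2} f(Y(t)) dt, with Y a Brownian
    motion conditioned on Y(t1) = a and Y(t2) = b (a Brownian bridge)."""
    rng = np.random.default_rng(seed)
    dt = (t2 - t1) / n_steps

    # Unconditioned Brownian paths started at 0 on [t1, t2].
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

    # Pin both endpoints: Y(t) = a + W(t) - (t - t1)/(t2 - t1) * (W(t2) - (b - a)).
    frac = np.linspace(0.0, 1.0, n_steps + 1)
    Y = a + W - frac * (W[:, -1:] - (b - a))

    # Trapezoidal quadrature of f(Y(t)) along each path.
    fY = f(Y)
    return dt * (fY[:, :-1] + fY[:, 1:]).sum(axis=1) / 2.0

# Example: time-integrated Brownian bridge from a=0 to b=1 over [0, 1].
samples = sample_time_integrated_bridge(a=0.0, b=1.0, t1=0.0, t2=1.0)
print(samples.mean())  # close to (a + b)/2 * (t2 - t1) = 0.5
```

Such a fine-grid simulation is the expensive ground truth that collocation-based samplers aim to avoid.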
References
The Seven-League Scheme: Deep learning for large time step Monte Carlo simulations of stochastic differential equations
- Computer Science, Risks
- 2022
Basic error analysis indicates that this data-driven scheme yields accurate SDE solutions in the sense of strong convergence, provided the learning methodology is robust and accurate; the novel scheme outperforms some classical numerical SDE discretizations.
Solving high-dimensional partial differential equations using deep learning
- Computer Science, Proceedings of the National Academy of Sciences
- 2018
A deep learning-based approach that can handle general high-dimensional parabolic PDEs using backward stochastic differential equations; the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning with the gradient acting as the policy function.
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Computer Science, J. Comput. Phys.
- 2019
The Stochastic Collocation Monte Carlo Sampler: Highly Efficient Sampling from 'Expensive' Distributions
- Mathematics
- 2019
In this article, we propose an efficient approach for inverting computationally expensive cumulative distribution functions. A collocation method, called the Stochastic Collocation Monte Carlo…
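As an illustration of the collocation idea described in this reference, the sketch below inverts an "expensive" CDF only at a few Gauss-Hermite collocation points of a standard normal and then pushes many cheap normal samples through a Lagrange interpolant of that map. The lognormal stand-in for an expensive distribution and all names are illustrative assumptions, not taken from the cited article.

```python
import numpy as np
from scipy.stats import norm, lognorm
from scipy.interpolate import lagrange

def scmc_sampler(target_ppf, n_colloc=5, n_samples=1_000_000, seed=0):
    """SCMC-style sampler: evaluate an expensive inverse CDF (ppf) only at a
    few collocation points, then map cheap standard-normal samples through a
    Lagrange interpolant of g(x) = F_Y^{-1}(Phi(x))."""
    # Collocation points = Gauss-Hermite nodes of the standard normal
    # (probabilists' Hermite polynomials).
    x_nodes, _ = np.polynomial.hermite_e.hermegauss(n_colloc)
    y_nodes = target_ppf(norm.cdf(x_nodes))   # the only 'expensive' inversions
    g = lagrange(x_nodes, y_nodes)            # degree n_colloc - 1 polynomial

    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples)        # cheap samples
    return g(x)                               # approximate samples of Y

# Example: 'expensive' target = lognormal; compare the mean with the exact law.
samples = scmc_sampler(lambda u: lognorm.ppf(u, s=0.25))
print(samples.mean(), lognorm.mean(s=0.25))
```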
Adam: A Method for Stochastic Optimization
- Computer Science, ICLR
- 2015
This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
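For reference, a minimal NumPy sketch of the Adam update rule summarized above (adaptive first- and second-moment estimates of the gradient with bias correction); the learning rate and the toy objective are illustrative choices.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (Kingma & Ba, 2015)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2     # second-moment (uncentred) estimate
    m_hat = m / (1 - beta1**t)                # bias correction
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Example: minimise f(theta) = ||theta||^2, whose gradient is 2*theta.
theta = np.ones(3)
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
print(theta)  # close to zero
```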
Stochastic Bridges of Linear Systems
- Mathematics, IEEE Transactions on Automatic Control
- 2016
A stochastic differential equation (SDE) is constructed that generates a bridge agreeing with the statistics of the conditioned process; degenerate diffusions and higher-order linear diffusions are considered.
On an efficient multiple time step Monte Carlo simulation of the SABR model
- Computer Science
- 2016
The present multiple time step Monte Carlo simulation technique for pricing options under the Stochastic Alpha Beta Rho model is especially useful for long-term options and for exotic options.
On unbiased simulations of stochastic bridges conditioned on extrema
- Mathematics
- 2019
Two algorithms for generating Brownian bridges constrained to a given extremum are compared, one of which generalises to other diffusions; the case of drift is also treated, and applications to geometric Brownian motions are considered.
Activation Functions: Comparison of trends in Practice and Research for Deep Learning
- Computer Science, ArXiv
- 2018
This paper is the first to compile the trends in activation function (AF) applications in practice against research results from the deep learning literature to date.
Exact Simulation of Stochastic Volatility and Other Affine Jump Diffusion Processes
- Computer Science, Oper. Res.
- 2006
This paper suggests a method for the exact simulation of the stock price and variance under Heston's stochastic volatility model and other affine jump diffusion processes, and achieves an $O(s^{-1/2})$ convergence rate, where $s$ is the total computational budget.
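One ingredient of such an exact scheme is that the variance (CIR) transition under Heston follows a scaled noncentral chi-squared distribution, which can be sampled exactly. The hedged sketch below shows only that variance leg, not the full stock-price step; parameter values and the function name are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ncx2

def sample_cir_exact(v0, kappa, theta, sigma, dt, n_samples=100_000, seed=0):
    """Exact one-step sample of the CIR variance process
        dv = kappa*(theta - v) dt + sigma*sqrt(v) dW,
    whose transition is a scaled noncentral chi-squared distribution."""
    rng = np.random.default_rng(seed)
    exp_k = np.exp(-kappa * dt)
    c = sigma**2 * (1.0 - exp_k) / (4.0 * kappa)                 # scale factor
    df = 4.0 * kappa * theta / sigma**2                          # degrees of freedom
    nc = 4.0 * kappa * exp_k * v0 / (sigma**2 * (1.0 - exp_k))   # non-centrality
    return c * ncx2.rvs(df, nc, size=n_samples, random_state=rng)

# Example: variance after one year under illustrative Heston-type parameters.
v1 = sample_cir_exact(v0=0.04, kappa=1.5, theta=0.04, sigma=0.3, dt=1.0)
# Exact conditional mean: theta + (v0 - theta) * exp(-kappa * dt) = 0.04 here.
print(v1.mean())
```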