• Corpus ID: 88523857

# Fractional Langevin Monte Carlo: Exploring Lévy Driven Stochastic Differential Equations for Markov Chain Monte Carlo

@inproceedings{Simsekli2017FractionalLM,
  title={Fractional Langevin Monte Carlo: Exploring L{\'e}vy Driven Stochastic Differential Equations for Markov Chain Monte Carlo},
  author={Umut {\c{S}}im{\c{s}}ekli},
  booktitle={ICML},
  year={2017}
}
Along with the recent advances in scalable Markov Chain Monte Carlo (MCMC) methods, sampling techniques based on Langevin diffusions have started receiving increasing attention. These so-called Langevin Monte Carlo (LMC) methods are based on diffusions driven by a Brownian motion, which gives rise to Gaussian proposal distributions in the resulting algorithms. Even though these approaches have proven successful in many applications, their performance can be limited by the light-tailed nature…
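The Brownian-driven LMC recursion described in the abstract can be sketched as below: a minimal unadjusted Langevin update $x_{k+1} = x_k + h\,\nabla \log \pi(x_k) + \sqrt{2h}\,\xi_k$ with Gaussian noise, applied to a standard normal target. All function names, the step size, and the chain length are illustrative assumptions, not taken from the paper.

```python
import math
import random

def langevin_monte_carlo(grad_log_pi, x0, step, n_steps, rng):
    """Unadjusted Langevin algorithm: Euler discretization of the
    Brownian-driven diffusion dX_t = grad log pi(X_t) dt + sqrt(2) dB_t."""
    samples = []
    x = x0
    for _ in range(n_steps):
        # Gaussian proposal: drift toward high density plus Brownian noise.
        x = x + step * grad_log_pi(x) + math.sqrt(2.0 * step) * rng.gauss(0.0, 1.0)
        samples.append(x)
    return samples

# Target: standard normal, so grad log pi(x) = -x.
rng = random.Random(0)
chain = langevin_monte_carlo(lambda x: -x, x0=5.0, step=0.01, n_steps=100000, rng=rng)
burned = chain[10000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

For small steps the chain's stationary law is close to $\pi$ but carries an $O(h)$ discretization bias; adding a Metropolis accept/reject step (MALA) removes it. The paper's point is that the Gaussian noise term here is what makes the proposals light-tailed.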
## 2 Citations

### A Tail-Index Analysis of Stochastic Gradient Noise in Deep Neural Networks

• Computer Science • ICML • 2019
It is argued that the Gaussianity assumption might fail to hold in deep learning settings and hence render the Brownian motion-based analyses inappropriate and open up a different perspective and shed more light on the belief that SGD prefers wide minima.

### On the Heavy-Tailed Theory of Stochastic Gradient Descent for Deep Neural Networks

• Computer Science • arXiv • 2019
It is argued that the Gaussianity assumption might fail to hold in deep learning settings and hence render the Brownian motion-based analyses inappropriate and establish an explicit connection between the convergence rate of SGD to a local minimum and the tail-index $\alpha$.

## References

Showing 1–10 of 45 references

### Stochastic Quasi-Newton Langevin Monte Carlo

• Computer Science • ICML • 2016
This study proposes a novel SG-MCMC method that takes the local geometry into account by using ideas from Quasi-Newton optimization methods, and achieves fast convergence rates similar to Riemannian approaches while at the same time having low computational requirements similar to diagonal preconditioning approaches.

### Stochastic Gradient Richardson-Romberg Markov Chain Monte Carlo

• Computer Science • NIPS • 2016
A novel sampling algorithm that aims to reduce the bias of SG-MCMC while keeping the variance at a reasonable level is proposed, and it is shown that SGRRLD is asymptotically consistent, satisfies a central limit theorem, and that its non-asymptotic bias and mean squared error can be bounded.
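The Richardson–Romberg idea behind SGRRLD can be illustrated with a toy sketch: run two Langevin chains at step sizes $h$ and $h/2$ and combine their ergodic averages as $2\,\hat{\phi}_{h/2} - \hat{\phi}_h$, which cancels the leading $O(h)$ bias term. This sketch uses a plain (non-stochastic-gradient) unadjusted Langevin chain on a standard normal target; the function name, step sizes, and chain lengths are illustrative assumptions, not the paper's setup.

```python
import math
import random

def ula_second_moment(step, n_steps, seed):
    """Ergodic average of x**2 over an unadjusted Langevin chain targeting
    N(0,1); the estimate carries an O(step) discretization bias."""
    rng = random.Random(seed)
    x, acc = 0.0, 0.0
    for _ in range(n_steps):
        x = x + step * (-x) + math.sqrt(2.0 * step) * rng.gauss(0.0, 1.0)
        acc += x * x
    return acc / n_steps

h = 0.2
est_h = ula_second_moment(h, 500000, seed=0)        # biased high by ~h/2
est_h2 = ula_second_moment(h / 2.0, 1000000, seed=1)
rr = 2.0 * est_h2 - est_h  # Richardson-Romberg: cancels the O(h) bias term
```

For this target the true second moment is 1; the single-chain estimates overshoot it, while the extrapolated estimate lands much closer, at the price of some extra variance (which SGRRLD controls by running the two chains on correlated noise).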

### Convergence of Heavy‐tailed Monte Carlo Markov Chain Algorithms

• Mathematics, Computer Science • 2007
The paper gives for the first time a theoretical justification for the common belief that heavy‐tailed proposal distributions improve convergence in the context of random‐walk Metropolis algorithms.

### Lévy-Driven Langevin Systems: Targeted Stochasticity

• Mathematics • 2003
Langevin dynamics driven by random Wiener noise (“white noise”), and the resulting Fokker–Planck equation and Boltzmann equilibria are fundamental to the understanding of transport and relaxation.

### On the Convergence of Stochastic Gradient MCMC Algorithms with High-Order Integrators

• Computer Science • NIPS • 2015
This paper considers general SG-MCMCs with high-order integrators, and develops theory to analyze finite-time convergence properties and their asymptotic invariant measures.

### Langevin-Type Models II: Self-Targeting Candidates for MCMC Algorithms

• Computer Science, Mathematics • 1999
It is shown that the class of candidate distributions, developed in Part I, which “self-target” towards the high density areas of π, produce Metropolis-Hastings algorithms with convergence rates that appear to be considerably better than those known for the traditional candidate choices, such as random walk.

### Langevin Diffusions and Metropolis-Hastings Algorithms

• Mathematics • 2002
We consider a class of Langevin diffusions with state-dependent volatility. The volatility of the diffusion is chosen so as to make the stationary distribution of the diffusion with respect to its…

### Recursive computation of the invariant measure of a stochastic differential equation driven by a Lévy process

We investigate some recursive procedures based on an exact or "approximate" Euler scheme with decreasing step, with a view to computing invariant measures of solutions to S.D.E.s driven by a Lévy process…
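A decreasing-step Euler recursion of the kind this reference studies can be sketched as below, with symmetric $\alpha$-stable increments drawn via the standard Chambers–Mallows–Stuck method and noise scaled by $\gamma_k^{1/\alpha}$ (the stable analogue of the Brownian $\sqrt{\gamma_k}$ scaling). The function names, step schedule, and parameter values are illustrative assumptions, not taken from the paper.

```python
import math
import random

def sas_sample(alpha, rng):
    """Standard symmetric alpha-stable draw (Chambers-Mallows-Stuck, alpha != 1)."""
    u = rng.uniform(-math.pi / 2.0, math.pi / 2.0)  # uniform angle
    w = rng.expovariate(1.0)                        # unit exponential
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
            * (math.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

def euler_decreasing_step(drift, alpha, x0, n_steps, rng, gamma0=0.1):
    """Euler scheme with decreasing steps gamma_k = gamma0 / sqrt(k) for the
    Levy-driven SDE dX_t = drift(X_t) dt + dL_t."""
    x = x0
    path = [x]
    for k in range(1, n_steps + 1):
        gamma = gamma0 / math.sqrt(k)
        # Jump increment over a step of length gamma scales as gamma**(1/alpha).
        x = x + gamma * drift(x) + gamma ** (1.0 / alpha) * sas_sample(alpha, rng)
        path.append(x)
    return path

rng = random.Random(1)
path = euler_decreasing_step(lambda x: -x, alpha=1.7, x0=0.0, n_steps=5000, rng=rng)
```

As a sanity check on the noise generator, at $\alpha = 2$ the CMS formula reduces to a centered Gaussian with variance 2, while $\alpha < 2$ produces the heavy-tailed jumps that motivate fractional LMC.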

### Non-asymptotic convergence analysis for the Unadjusted Langevin Algorithm

• Mathematics, Computer Science • 2015
For both constant and decreasing step sizes in the Euler discretization, non-asymptotic bounds for the convergence to the target distribution $\pi$ in total variation distance are obtained.

### Stochastic thermodynamic integration: Efficient Bayesian model selection via stochastic gradient MCMC

• Computer Science • 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2016
This study proposes a computationally efficient model selection method by integrating ideas from Stochastic Gradient Markov Chain Monte Carlo literature and statistical physics, which has very low computational needs and can be implemented almost without modifying existing SG-MCMC code.