Heavy-tailed Sampling via Transformed Unadjusted Langevin Algorithm
@inproceedings{He2022HeavytailedSV,
  title={Heavy-tailed Sampling via Transformed Unadjusted Langevin Algorithm},
  author={Ye He and Krishnakumar Balasubramanian and Murat A. Erdogdu},
  year={2022}
}
We analyze the oracle complexity of sampling from polynomially decaying heavy-tailed target densities based on running the Unadjusted Langevin Algorithm on certain transformed versions of the target density. The closed-form transformation maps that we construct are shown to be diffeomorphisms and are particularly suited for developing efficient diffusion-based samplers. We characterize the precise class of heavy-tailed densities for which polynomial-order oracle complexities…
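The recipe admits a compact illustration. Below is a minimal 1-D sketch, assuming a Student-t-like target π(x) ∝ (1 + x²)^(−p) and the illustrative map T(y) = sinh(y), a hypothetical stand-in rather than the class of maps constructed in the paper: ULA is run on the transformed density π_T(y) ∝ π(T(y))·T′(y), which here is light-tailed and log-concave for p > 1/2, and the iterates are mapped back through T.

```python
# Minimal 1-D sketch: sample a heavy-tailed target by running ULA on a
# transformed density.  T(y) = sinh(y) is an illustrative choice, not
# necessarily the transformation constructed in the paper.
import numpy as np

rng = np.random.default_rng(0)

p = 2.0          # target: pi(x) ∝ (1 + x^2)^(-p), Student-t-like
eta = 0.05       # ULA step size
n_steps = 50_000

# Change of variables x = T(y) = sinh(y).  The transformed density is
#   pi_T(y) ∝ pi(sinh(y)) * cosh(y) ∝ cosh(y)^(1 - 2p),
# which is light-tailed and log-concave for p > 1/2, so plain ULA applies.
def grad_log_pi_T(y):
    return (1.0 - 2.0 * p) * np.tanh(y)

y = 0.0
samples = np.empty(n_steps)
for k in range(n_steps):
    y = y + eta * grad_log_pi_T(y) + np.sqrt(2.0 * eta) * rng.standard_normal()
    samples[k] = np.sinh(y)   # map back: samples approximately follow pi
```

The effect of the transformation is visible in grad_log_pi_T: the heavy-tailed target's vanishing score becomes a bounded, smooth drift after the change of variables, which is what makes the diffusion-based sampler efficient.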
2 Citations
Towards a Theory of Non-Log-Concave Sampling: First-Order Stationarity Guarantees for Langevin Monte Carlo
- Computer Science, Mathematics · COLT
- 2022
It is proved that averaged Langevin Monte Carlo outputs a sample with ε-relative Fisher information after O(L²d²/ε²) iterations, which constitutes a first step towards the general theory of non-log-concave sampling.
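The averaged output law can be realized by returning the LMC iterate at a uniformly random time. A minimal sketch under that reading, where grad_log_pi, x0, eta, and n_steps are assumed inputs rather than notation from the paper:

```python
# Minimal sketch of "averaged" Langevin Monte Carlo: the output law is the
# average over iterate laws, realized here by returning the iterate at a
# uniformly random time K in {1, ..., n_steps}.
import numpy as np

def averaged_lmc(grad_log_pi, x0, eta, n_steps, rng):
    k_out = rng.integers(1, n_steps + 1)   # uniformly random output time
    x = np.asarray(x0, dtype=float)
    for k in range(1, k_out + 1):          # only simulate up to the output time
        x = x + eta * grad_log_pi(x) + np.sqrt(2.0 * eta) * rng.standard_normal(x.shape)
    return x
```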
Fisher information lower bounds for sampling
- Computer Science · arXiv
- 2022
We prove two lower bounds for the complexity of non-log-concave sampling within the framework of Balasubramanian et al. (2022), who introduced the use of Fisher information (FI) bounds as a notion…
References
Showing 1-10 of 59 references
Approximation of heavy-tailed distributions via stable-driven SDEs
- Mathematics, Computer Science
- 2020
This paper provides a rigorous theoretical framework for studying the problem of approximating heavy-tailed distributions via ergodic SDEs driven by symmetric (rotationally invariant) α-stable processes.
Optimal dimension dependence of the Metropolis-Adjusted Langevin Algorithm
- Computer Science · COLT
- 2021
The upper bound proof introduces a new technique based on a projection characterization of the Metropolis adjustment, which reduces the study of MALA to the well-studied discretization analysis of the Langevin SDE and bypasses direct computation of the acceptance probability.
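For context, one MALA step pairs the Langevin proposal with a Metropolis-Hastings correction; the entry's point is that the analysis can avoid handling the acceptance probability below directly. A minimal sketch, with f and grad_f as assumed user-supplied callables for a target π ∝ exp(−f):

```python
# Minimal sketch of one Metropolis-Adjusted Langevin (MALA) step.
import numpy as np

def mala_step(x, f, grad_f, h, rng):
    # Langevin proposal: Euler step of the Langevin SDE plus Gaussian noise.
    y = x - h * grad_f(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
    # Log proposal densities q(y|x) and q(x|y) (Gaussian, up to shared constants).
    log_q_fwd = -np.sum((y - x + h * grad_f(x)) ** 2) / (4.0 * h)
    log_q_bwd = -np.sum((x - y + h * grad_f(y)) ** 2) / (4.0 * h)
    # Metropolis-Hastings acceptance ratio for pi ∝ exp(-f).
    log_alpha = (f(x) - f(y)) + (log_q_bwd - log_q_fwd)
    return y if np.log(rng.uniform()) < log_alpha else x
```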
Unadjusted Langevin algorithm for sampling a mixture of weakly smooth potentials
- Mathematics · Brazilian Journal of Probability and Statistics
- 2022
The problem of sampling via Euler discretization is studied, where the potential function is assumed to be a mixture of weakly smooth potentials satisfying a weak dissipativity condition, and convergence guarantees are proved under a Poincaré inequality or non-strong convexity outside a ball.
Analysis of Langevin Monte Carlo from Poincare to Log-Sobolev
- Computer Science, Mathematics · COLT
- 2022
This work provides the first Rényi divergence convergence guarantees for LMC which allow for weak smoothness and do not require convexity or dissipativity conditions, and introduces techniques for bounding error terms under a certain change of measure, which is a new feature in Rényi analysis.
User-friendly guarantees for the Langevin Monte Carlo with inaccurate gradient
- Computer Science · Stochastic Processes and their Applications
- 2019
Non-Asymptotic Analysis of Fractional Langevin Monte Carlo for Non-Convex Optimization
- Computer Science · ICML
- 2019
The non-asymptotic behavior of FLMC for non-convex optimization is analyzed and finite-time bounds for its expected suboptimality are proved; the results show that the weak error of FLMC increases faster than that of LMC, which suggests using smaller step-sizes in FLMC.
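A minimal sketch of an FLMC-style update, driven by symmetric α-stable noise via scipy.stats.levy_stable. Using −∇f directly as the drift is a common simplification (the exact stationarity-preserving drift involves a fractional operator), so this is illustrative rather than the precise scheme analyzed in the paper:

```python
# Minimal sketch of a fractional Langevin Monte Carlo (FLMC)-style Euler step
# driven by symmetric alpha-stable noise.  The drift -grad_f(x) is a common
# simplification of the exact fractional drift, so this sketch carries bias.
import numpy as np
from scipy.stats import levy_stable

def flmc_step(x, grad_f, eta, alpha, rng):
    noise = levy_stable.rvs(alpha, 0.0, size=x.shape, random_state=rng)
    # Stable noise scales as eta^(1/alpha), unlike sqrt(eta) in the Gaussian case,
    # which is one reason step-size choices differ from plain LMC.
    return x - eta * grad_f(x) + eta ** (1.0 / alpha) * noise
```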
Nonasymptotic bounds for sampling algorithms without log-concavity
- Computer Science, Mathematics · The Annals of Applied Probability
- 2020
It is shown that the variance of the randomised drift does not influence the rate of weak convergence of the Euler scheme to the SDE, and non-asymptotic bounds on the distance between the laws induced by Euler schemes and the invariant laws of SDEs are derived.
Rapid Convergence of the Unadjusted Langevin Algorithm: Isoperimetry Suffices
- Mathematics, Computer Science · NeurIPS
- 2019
A convergence guarantee in Kullback-Leibler (KL) divergence is proved, assuming ν satisfies a log-Sobolev inequality and the Hessian of f is bounded.
Analysis of Langevin Monte Carlo via Convex Optimization
- Computer Science · J. Mach. Learn. Res.
- 2019
It is shown that the Unadjusted Langevin Algorithm can be formulated as a first-order optimization algorithm for an objective functional defined on the Wasserstein space of order 2, and a non-asymptotic analysis of this method for sampling from a smooth log-concave target distribution is given.
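In symbols, the objective functional in that formulation is the KL divergence to the target ν ∝ e^(−f); one standard way to write it and to read the ULA update (the notation here is ours, consistent with the entry above):

```latex
% KL objective on the Wasserstein space of order 2, target \nu \propto e^{-f}:
\mathcal{F}(\rho) \;=\; \mathrm{KL}(\rho \,\|\, \nu)
  \;=\; \int f \,\mathrm{d}\rho \;+\; \int \rho \log \rho \;+\; \mathrm{const.}
% One ULA iteration = a forward (gradient) step on the potential-energy term,
% followed by an exact flow of the entropy term (heat flow) for time h:
x_{k+1} \;=\; x_k - h\,\nabla f(x_k) + \sqrt{2h}\,\xi_k,
  \qquad \xi_k \sim \mathcal{N}(0, I_d).
```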
On the Ergodicity, Bias and Asymptotic Normality of Randomized Midpoint Sampling Method
- Computer Science, Mathematics · NeurIPS
- 2020
This paper describes the stationary distribution of the discrete chain obtained with constant step-size discretization, shows that it is biased away from the target distribution, and establishes asymptotic normality for numerical integration using the randomized midpoint method.
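A minimal sketch of one randomized-midpoint step for the overdamped Langevin diffusion in the constant-step-size setting the entry refers to; grad_f and h are assumed inputs, and the two Gaussian draws share one Brownian path, as the method requires:

```python
# Minimal sketch of one randomized-midpoint step for overdamped Langevin with
# potential f.  The gradient is evaluated at a uniformly random intermediate
# time u*h along the step, and both noise terms come from one Brownian path.
import numpy as np

def randomized_midpoint_step(x, grad_f, h, rng):
    u = rng.uniform()                               # random midpoint time u*h
    z1 = rng.standard_normal(x.shape)               # builds W(u*h) ~ N(0, u*h*I)
    z2 = rng.standard_normal(x.shape)               # increment over [u*h, h]
    w_mid = np.sqrt(u * h) * z1
    w_full = w_mid + np.sqrt((1.0 - u) * h) * z2    # same Brownian path at time h
    x_mid = x - u * h * grad_f(x) + np.sqrt(2.0) * w_mid
    # Full step uses the gradient evaluated at the random midpoint.
    return x - h * grad_f(x_mid) + np.sqrt(2.0) * w_full
```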