Corpus ID: 222080194

Variance reduction for Random Coordinate Descent-Langevin Monte Carlo.

@article{Ding2020VarianceRF,
  title={Variance reduction for Random Coordinate Descent-Langevin Monte Carlo.},
  author={Zhiyan Ding and Q. Li},
  journal={arXiv: Machine Learning},
  year={2020}
}
Sampling from a log-concave distribution is a core problem with wide applications in Bayesian statistics and machine learning. While most gradient-free methods have slow convergence rates, Langevin Monte Carlo (LMC) converges quickly but requires the computation of gradients. In practice, finite-difference approximations are used as surrogates, which makes the method expensive in high dimensions. A natural strategy to reduce the computational cost of each iteration is to…

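As a hedged illustration of the strategy the abstract alludes to, the sketch below runs an overdamped Langevin (Euler-Maruyama) update in which the full gradient is replaced by a finite-difference estimate along a single randomly chosen coordinate, rescaled by the dimension so the estimate stays unbiased. The function name rcd_langevin, the step size, the finite-difference width, and the Gaussian toy target are assumptions made for exposition; this is not the authors' algorithm or code.

    import numpy as np

    def rcd_langevin(f, x0, step=1e-2, n_iter=10_000, fd_eps=1e-5, seed=0):
        """Overdamped Langevin sampling with a random-coordinate
        finite-difference gradient surrogate (illustrative sketch)."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float).copy()
        d = x.size
        samples = np.empty((n_iter, d))
        for k in range(n_iter):
            i = rng.integers(d)              # pick one coordinate uniformly at random
            e = np.zeros(d)
            e[i] = fd_eps
            # central finite difference of f = -log(target density) along coordinate i
            partial_i = (f(x + e) - f(x - e)) / (2.0 * fd_eps)
            grad_est = np.zeros(d)
            grad_est[i] = d * partial_i      # rescale by d to keep the gradient estimate unbiased
            # Euler-Maruyama discretization of overdamped Langevin dynamics
            x = x - step * grad_est + np.sqrt(2.0 * step) * rng.standard_normal(d)
            samples[k] = x
        return samples

    # Toy usage: sample a standard Gaussian in 5 dimensions, f(x) = ||x||^2 / 2.
    if __name__ == "__main__":
        f = lambda x: 0.5 * np.dot(x, x)
        chain = rcd_langevin(f, x0=np.ones(5), n_iter=5_000)
        print(chain[2_000:].mean(axis=0))    # coordinate means should be near zero

Each iteration of this sketch costs two function evaluations rather than the O(d) evaluations a full finite-difference gradient would need, which is the per-iteration saving the abstract refers to.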