# Variance reduction for Random Coordinate Descent-Langevin Monte Carlo

```bibtex
@article{Ding2020VarianceRF,
  title   = {Variance reduction for Random Coordinate Descent-Langevin Monte Carlo},
  author  = {Zhiyan Ding and Q. Li},
  journal = {arXiv: Machine Learning},
  year    = {2020}
}
```

Sampling from a log-concave distribution is a core problem with wide applications in Bayesian statistics and machine learning. While most gradient-free methods have slow convergence rates, Langevin Monte Carlo (LMC) converges quickly but requires the computation of gradients. In practice one uses finite-differencing approximations as surrogates, and the method becomes expensive in high dimensions. A natural strategy to reduce the computational cost in each iteration is to…
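The strategy the abstract alludes to can be illustrated with a minimal sketch: instead of finite-differencing all `d` coordinates per LMC step, pick one coordinate uniformly at random, estimate only that partial derivative, and rescale by `d` so the gradient estimate remains unbiased. This is an illustrative random-coordinate LMC loop under assumed parameter names (`step`, `fd_eps`), not the paper's exact variance-reduced scheme.

```python
import numpy as np

def rcd_lmc(f, x0, step=0.01, n_iters=20000, fd_eps=1e-5, seed=0):
    """Langevin Monte Carlo with a random-coordinate finite-difference
    gradient surrogate (a sketch of the idea, not the paper's method).

    Each iteration evaluates the potential f only twice: the partial
    derivative along one randomly chosen coordinate is approximated by
    central differencing and rescaled by d to keep the estimate unbiased.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    samples = np.empty((n_iters, d))
    for k in range(n_iters):
        i = rng.integers(d)                     # random coordinate
        e = np.zeros(d)
        e[i] = fd_eps
        partial = (f(x + e) - f(x - e)) / (2 * fd_eps)
        grad_est = np.zeros(d)
        grad_est[i] = d * partial               # unbiased: E[grad_est] = grad f(x)
        # Euler-Maruyama step of the overdamped Langevin dynamics
        x = x - step * grad_est + np.sqrt(2 * step) * rng.standard_normal(d)
        samples[k] = x
    return samples

# Usage: sample from N(0, I_2), whose potential is f(x) = ||x||^2 / 2.
samples = rcd_lmc(lambda x: 0.5 * np.sum(x**2), x0=np.zeros(2))
```

Each step costs two evaluations of `f` instead of the `2d` needed to finite-difference the full gradient; the price is extra variance in the gradient estimate, which is exactly what the paper's variance-reduction techniques target.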
