Differentially Private Variance Reduced Stochastic Gradient Descent

  • Jaewoo Lee
  • Published 2017 in
    2017 International Conference on New Trends in…
In this paper, we propose a differentially private stochastic variance reduced gradient algorithm, called DP-SVRG. To privatize the SVRG algorithm, we randomize the gradient computation process by injecting random noise. There are two main challenges in this approach: (i) high variance of stochastic gradient updates, and (ii) low per-iteration privacy budget. To…
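
The abstract's core idea, injecting noise into the SVRG gradient step, can be sketched as follows. This is a minimal illustration on a synthetic least-squares problem, not the paper's algorithm: the noise scale `sigma`, step size, and all other hyperparameters here are assumptions for demonstration, whereas DP-SVRG would calibrate the noise to the per-iteration privacy budget.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

def grad_i(w, i):
    # Gradient of 0.5 * (x_i^T w - y_i)^2 for a single example.
    return (X[i] @ w - y[i]) * X[i]

def full_grad(w):
    # Full-batch gradient, recomputed at each snapshot point.
    return X.T @ (X @ w - y) / n

def loss(w):
    return 0.5 * np.mean((X @ w - y) ** 2)

def noisy_svrg_sketch(epochs=20, m=50, eta=0.05, sigma=0.01):
    """SVRG outer/inner loop with Gaussian noise added to each
    variance-reduced gradient (illustrative, not privacy-calibrated)."""
    w_snap = np.zeros(d)
    for _ in range(epochs):
        mu = full_grad(w_snap)            # anchor gradient at the snapshot
        w = w_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient ...
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            # ... randomized by injected Gaussian noise.
            g += sigma * rng.normal(size=d)
            w -= eta * g
        w_snap = w
    return w_snap

w_hat = noisy_svrg_sketch()
```

Variance reduction matters here because a smaller gradient variance lets the injected privacy noise dominate less of the update, which is exactly the interplay the two stated challenges point to.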