Conghui Tan
One of the major issues in stochastic gradient descent (SGD) methods is how to choose an appropriate step size while running the algorithm. Since the traditional line search technique does not apply to stochastic optimization algorithms, the common practice in SGD is either to use a diminishing step size or to tune a fixed step size by hand.
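The diminishing-step-size practice mentioned above can be sketched in a few lines. The schedule eta_t = eta0 / (1 + t) below is one common illustrative choice, not necessarily the one used in the paper; the quadratic objective and noise model are assumptions for demonstration only.

```python
import random

def sgd_diminishing(stochastic_grad, x0, eta0=0.5, steps=500):
    """Minimal SGD sketch with a diminishing step size.

    Uses the assumed schedule eta_t = eta0 / (1 + t), which satisfies the
    classical conditions sum(eta_t) = inf and sum(eta_t^2) < inf.
    """
    x = x0
    for t in range(steps):
        eta = eta0 / (1 + t)          # step size shrinks over iterations
        x = x - eta * stochastic_grad(x)
    return x

# Illustrative stochastic gradient of f(x) = 0.5 * x^2 with additive noise.
random.seed(0)
noisy_grad = lambda x: x + random.gauss(0.0, 0.1)

x_final = sgd_diminishing(noisy_grad, x0=5.0)
```

With a fixed step size the iterates would instead oscillate in a noise ball around the minimizer, which is why the fixed step must be tuned by hand; the diminishing schedule trades that tuning for slower late-stage progress.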