GoSGD: Distributed Optimization for Deep Learning with Gossip Exchange

@article{Blot2018GoSGDDO,
  title={GoSGD: Distributed Optimization for Deep Learning with Gossip Exchange},
  author={Michael Blot and David Picard and Matthieu Cord},
  journal={CoRR},
  year={2018},
  volume={abs/1804.01852}
}
We address the issue of speeding up the training of convolutional neural networks by studying a distributed method adapted to stochastic gradient descent. Our parallel optimization setup uses several threads, each applying individual gradient descents on a local variable. We propose a new way of sharing information between different threads based on gossip…
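The abstract only sketches the idea of local SGD workers that occasionally exchange their variables. The snippet below is a minimal illustrative sketch of such a gossip-style exchange, not the paper's reference implementation: the weighted-average update with mixing weights alpha, the exchange probability p_exchange, and all names (Worker, gossip_push, train) are assumptions introduced here for illustration.

```python
# Illustrative sketch of gossip-based exchange between local SGD workers.
# All names and the exact update rule are hypothetical, for illustration only.

import random
import numpy as np


class Worker:
    """One thread's local state: parameters x and a mixing weight alpha."""

    def __init__(self, dim, seed):
        rng = np.random.default_rng(seed)
        self.x = rng.normal(size=dim)   # local copy of the model parameters
        self.alpha = 1.0                # weight used when averaging during gossip

    def local_sgd_step(self, grad, lr=0.1):
        """Ordinary SGD step on the local variable."""
        self.x -= lr * grad(self.x)


def gossip_push(sender, receiver):
    """Sender pushes half of its mixing weight; receiver forms a weighted average."""
    sender.alpha /= 2.0
    total = receiver.alpha + sender.alpha
    receiver.x = (receiver.alpha * receiver.x + sender.alpha * sender.x) / total
    receiver.alpha = total


def train(workers, grad, steps=100, p_exchange=0.2, seed=0):
    """Each worker runs local SGD; with probability p_exchange it gossips to a random peer."""
    rng = random.Random(seed)
    for _ in range(steps):
        for i, w in enumerate(workers):
            w.local_sgd_step(grad)
            if rng.random() < p_exchange:
                j = rng.choice([k for k in range(len(workers)) if k != i])
                gossip_push(w, workers[j])
    return workers


if __name__ == "__main__":
    # Toy quadratic objective: minimize ||x - 1||^2, whose gradient is 2 * (x - 1).
    grad = lambda x: 2.0 * (x - 1.0)
    workers = train([Worker(dim=4, seed=s) for s in range(4)], grad)
    print(np.mean([w.x for w in workers], axis=0))  # local variables end up near 1
```

In this sketch the exchanges are pairwise and randomized, so no central parameter server or global synchronization barrier is needed, which is the property gossip schemes are typically chosen for.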