## Asynchronous gossip principal components analysis

- Jérôme Fellus, David Picard, Philippe Henri Gosselin
- Neurocomputing
- 2015

```bibtex
@article{Blot2018GoSGDDO,
  title   = {GoSGD: Distributed Optimization for Deep Learning with Gossip Exchange},
  author  = {Michael Blot and David Picard and Matthieu Cord},
  journal = {CoRR},
  year    = {2018},
  volume  = {abs/1804.01852}
}
```

- Published 2018 on arXiv

We address the issue of speeding up the training of convolutional neural networks by studying a distributed method adapted to stochastic gradient descent. Our parallel optimization setup uses several threads, each applying individual gradient descent steps on a local variable. We propose a new way of sharing information between threads based on gossip…
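The abstract describes threads that each run local SGD and occasionally share information through gossip exchanges. A minimal single-process sketch of that idea (not the paper's actual algorithm): several "workers" minimize the same toy quadratic objective, and each step a worker may average its variable with a randomly chosen peer. All names and the toy objective are illustrative assumptions.

```python
import random

def gossip_sgd(num_workers=4, steps=200, lr=0.1, gossip_prob=0.3, seed=0):
    """Toy gossip-style SGD: each worker minimizes f(x) = (x - 3)^2
    on its own local variable; with probability gossip_prob it exchanges
    with a random peer and both adopt the pairwise average.
    (Illustrative sketch only, not the GoSGD algorithm itself.)"""
    rng = random.Random(seed)
    xs = [rng.uniform(-10.0, 10.0) for _ in range(num_workers)]
    for _ in range(steps):
        for i in range(num_workers):
            grad = 2.0 * (xs[i] - 3.0)       # gradient of (x - 3)^2
            xs[i] -= lr * grad               # local gradient step
            if rng.random() < gossip_prob:   # occasional gossip exchange
                j = rng.randrange(num_workers)
                if j != i:
                    avg = 0.5 * (xs[i] + xs[j])
                    xs[i] = xs[j] = avg      # pairwise averaging
    return xs

final = gossip_sgd()
```

Here gossip keeps the workers' variables close to one another while each still descends its local gradient; in the real distributed setting the exchange would happen over message passing between machines rather than in-process.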
