Corpus ID: 18568112

The Natural Gradient by Analogy to Signal Whitening, and Recipes and Tricks for its Use

@article{SohlDickstein2012TheNG,
  title={The Natural Gradient by Analogy to Signal Whitening, and Recipes and Tricks for its Use},
  author={Jascha Sohl-Dickstein},
  journal={ArXiv},
  year={2012},
  volume={abs/1205.1828}
}
The natural gradient allows for more efficient gradient descent by removing dependencies and biases inherent in a function's parameterization. Several papers present the topic thoroughly and precisely; it nevertheless remains a difficult idea to grasp. The intent of this note is to provide simple intuition for the natural gradient and its use. We review how an ill-conditioned parameter space can undermine learning, introduce the natural gradient by analogy to the more widely…
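The update the abstract alludes to, θ ← θ − η G⁻¹ ∇L, preconditions the gradient by the inverse of a metric G on parameter space (the Fisher information matrix in the natural-gradient setting), which is equivalent to whitening the parameterization. A minimal sketch on a toy ill-conditioned quadratic, where the loss Hessian stands in for G (the matrix `A` and step sizes are assumptions for this demo, not values from the paper):

```python
import numpy as np

# Ill-conditioned quadratic loss L(theta) = 0.5 * theta^T A theta,
# with curvatures 100 and 1 along the two axes.
A = np.diag([100.0, 1.0])

def grad(theta):
    return A @ theta

theta0 = np.array([1.0, 1.0])

# Plain gradient descent: the step size is capped by the largest
# curvature (100), so progress along the flat direction is slow.
theta = theta0.copy()
for _ in range(100):
    theta = theta - 0.009 * grad(theta)
plain_norm = np.linalg.norm(theta)

# Natural gradient: precondition by the metric G (here the Hessian A,
# playing the role of the Fisher matrix). This whitens the parameter
# space, and a single unit step reaches the minimum exactly.
G_inv = np.linalg.inv(A)
theta_ng = theta0 - 1.0 * (G_inv @ grad(theta0))
ng_norm = np.linalg.norm(theta_ng)

print(plain_norm, ng_norm)
```

After 100 plain steps the slow direction still retains roughly 40% of its initial error, while the preconditioned update converges in one step.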
