Training restricted Boltzmann machines: An introduction

@article{fischer2014training,
  title={Training restricted Boltzmann machines: An introduction},
  author={Asja Fischer and Christian Igel},
  journal={Pattern Recognition},
  year={2014}
}
Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. They have attracted much attention as building blocks for the multi-layer learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. This tutorial introduces RBMs from the viewpoint of Markov random fields, starting with the required concepts of undirected graphical models…
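The RBMs described in the abstract are commonly trained with contrastive divergence (CD), one of the approximate learning algorithms the tutorial covers. Below is a minimal illustrative sketch, not the paper's own code: a binary-binary RBM trained with CD-1 in NumPy on a toy dataset. All names, the learning rate, and the toy patterns are choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary-binary restricted Boltzmann machine trained with CD-1."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        # Small random weights; zero biases, as is conventional.
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def sample_h(self, v):
        # P(h_j = 1 | v) and a Bernoulli sample from it.
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        # P(v_i = 1 | h) and a Bernoulli sample from it.
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0):
        # Positive phase: hidden statistics driven by the data.
        ph0, h0 = self.sample_h(v0)
        # Negative phase: one Gibbs step (the "1" in CD-1).
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(v1)
        n = v0.shape[0]
        # Approximate log-likelihood gradient ascent.
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)
        # Reconstruction error: a rough (not exact) training signal.
        return np.mean((v0 - pv1) ** 2)

# Toy data: two repeated binary patterns the RBM should capture.
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)
rbm = RBM(n_visible=4, n_hidden=2)
errors = [rbm.cd1_step(data) for _ in range(200)]
print(errors[0], errors[-1])  # reconstruction error should decrease
```

Note that the reconstruction error is only a heuristic progress measure; as the tutorial discusses, CD follows a biased approximation of the log-likelihood gradient rather than minimizing reconstruction error directly.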
This paper has highly influenced 13 other papers. Semantic Scholar estimates that it has 162 citations (94 extracted citations) based on the available data, and it lists 62 references.


