Deep Boltzmann Machines

Abstract

We present a new learning algorithm for Boltzmann machines that contain many layers of hidden variables. Data-dependent expectations are estimated using a variational approximation that tends to focus on a single mode, and data-independent expectations are approximated using persistent Markov chains. The use of two quite different techniques for estimating the two types of expectation that enter into the gradient of the log-likelihood makes it practical to learn Boltzmann machines with multiple hidden layers and millions of parameters. The learning can be made more efficient by using a layer-by-layer “pre-training” phase that allows variational inference to be initialized with a single bottom-up pass. We present results on the MNIST and NORB datasets showing that deep Boltzmann machines learn good generative models and perform well on handwritten digit and visual object recognition tasks.
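
To make the abstract's recipe concrete, the sketch below implements one learning step for a toy two-layer deep Boltzmann machine in NumPy: mean-field fixed-point updates estimate the data-dependent expectations, a few Gibbs sweeps on persistent "fantasy" chains estimate the data-independent expectations, and the weight gradient is the difference of the two. This is a minimal illustration under assumptions of our own (layer sizes, learning rate, iteration counts, omitted biases and pre-training), not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Model sizes and weights (illustrative values; biases omitted).
nv, nh1, nh2 = 784, 500, 1000
W1 = 0.01 * rng.standard_normal((nv, nh1))
W2 = 0.01 * rng.standard_normal((nh1, nh2))

# Persistent "fantasy" chains for the data-independent expectations.
M = 100
v_f = rng.integers(0, 2, size=(M, nv)).astype(float)
h2_f = rng.integers(0, 2, size=(M, nh2)).astype(float)

def train_step(v, lr=0.005, mf_steps=10, gibbs_steps=5):
    global W1, W2, v_f, h2_f
    B = v.shape[0]
    # Data-dependent expectations: mean-field fixed point over h1 and h2,
    # given the clamped data v.
    mu2 = np.full((B, nh2), 0.5)
    for _ in range(mf_steps):
        mu1 = sigmoid(v @ W1 + mu2 @ W2.T)
        mu2 = sigmoid(mu1 @ W2)
    # Data-independent expectations: Gibbs sweeps on the persistent chains
    # (sample h1 given v and h2, then v and h2 given h1).
    for _ in range(gibbs_steps):
        h1_f = (rng.random((M, nh1)) < sigmoid(v_f @ W1 + h2_f @ W2.T)).astype(float)
        v_f = (rng.random((M, nv)) < sigmoid(h1_f @ W1.T)).astype(float)
        h2_f = (rng.random((M, nh2)) < sigmoid(h1_f @ W2)).astype(float)
    # Stochastic gradient of the log-likelihood: data term minus model term.
    W1 += lr * (v.T @ mu1 / B - v_f.T @ h1_f / M)
    W2 += lr * (mu1.T @ mu2 / B - h1_f.T @ h2_f / M)

# Example: one update on a random binary batch (a stand-in for an MNIST batch).
train_step(rng.integers(0, 2, size=(64, nv)).astype(float))

Pairing a deterministic mean-field approximation (which collapses onto a single mode) for the data term with sampled persistent chains (which can mix across modes) for the model term is the division of labour the abstract describes.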


Cite this paper

@inproceedings{Salakhutdinov2009DeepBM,
  title     = {Deep Boltzmann Machines},
  author    = {Ruslan Salakhutdinov and Geoffrey E. Hinton},
  booktitle = {AISTATS},
  year      = {2009}
}