The "wake-sleep" algorithm for unsupervised neural networks.

@article{Hinton1995TheA,
  title={The "wake-sleep" algorithm for unsupervised neural networks},
  author={Geoffrey E. Hinton and Peter Dayan and Brendan J. Frey and Radford M. Neal},
  journal={Science},
  year={1995},
  volume={268},
  number={5214},
  pages={1158--1161}
}
An unsupervised learning algorithm for a multilayer network of stochastic neurons is described. Bottom-up "recognition" connections convert the input into representations in successive hidden layers, and top-down "generative" connections reconstruct the representation in one layer from the representation in the layer above. In the "wake" phase, neurons are driven by recognition connections, and generative connections are adapted to increase the probability that they would reconstruct the… 
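
Schematically, the two phases reduce to a pair of purely local delta-rule updates. The NumPy sketch below is a minimal illustration under simplifying assumptions (a single hidden layer, a flat prior over top-level states, no bias terms, and arbitrary layer sizes and learning rate), not the paper's implementation:

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sample(p):
        # Stochastic binary neurons: fire with probability p.
        return (rng.random(p.shape) < p).astype(float)

    n_vis, n_hid = 16, 8                     # illustrative layer sizes
    R = rng.normal(0, 0.1, (n_hid, n_vis))   # bottom-up recognition weights
    G = rng.normal(0, 0.1, (n_vis, n_hid))   # top-down generative weights
    eps = 0.2                                # illustrative learning rate

    def wake_step(v):
        # Wake phase: recognition connections drive the hidden units, and the
        # generative weights are nudged toward reconstructing the layer below
        # (a local delta rule: prediction error times presynaptic activity).
        h = sample(sigmoid(R @ v))
        G += eps * np.outer(v - sigmoid(G @ h), h)

    def sleep_step():
        # Sleep phase: the generative connections drive a top-down fantasy,
        # and the recognition weights are nudged toward recovering its cause.
        h = sample(np.full(n_hid, 0.5))      # fantasized hidden state (flat prior)
        v = sample(sigmoid(G @ h))           # fantasized sensory input
        R += eps * np.outer(h - sigmoid(R @ v), v)

    # Alternate the two phases on toy binary data.
    for v in sample(np.full((100, n_vis), 0.3)):
        wake_step(v)
        sleep_step()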

A simple algorithm that discovers efficient perceptual codes

The "wake-sleep" algorithm, which allows a multilayer, unsupervised neural network to build a hierarchy of representations of sensory input, is described; the network is driven top-down by the generative connections to produce a fantasized representation and a fantasized sensory input.

Does the Wake-sleep Algorithm Produce Good Density Estimators?

This work uses a variety of synthetic and real data sets to compare the performance of the wake-sleep algorithm with Monte Carlo and mean field methods for fitting the same generative model and also compares it with other models that are less powerful but easier to fit.

A wake-sleep algorithm for recurrent, spiking neural networks

A conceptually very simple "wake-sleep" algorithm is introduced: during the wake phase, training proceeds normally, while during the sleep phase the network "dreams", drawing samples from its generative model that are induced by random input.

Biologically inspired sleep algorithm for artificial neural networks

Biological sleep can help mitigate a number of problems that ANNs suffer from, such as poor generalization and catastrophic forgetting in incremental learning, and can improve the ability of ANNs to classify images corrupted by various types of noise.

Pre-synaptic lateral inhibition provides a better architecture for self-organizing neural networks.

It is shown that a self-organizing neural network architecture using pre-synaptic lateral inhibition enables a single learning algorithm to find distributed, local, and topological representations as appropriate to the structure of the input data received.

An Analog VLSI Implementation of the Wake-Sleep Learning Algorithm Using BiStable Synaptic Weights

This work designed and simulated a low-power analog VLSI synapse circuit with analog but bistable weights that can implement the Wake-Sleep algorithm, and demonstrated that the algorithm can be used successfully with binary synaptic weights trained in a bistable manner.

Dynamical analysis of the Wake–Sleep algorithm

The analysis showed that the settings of the learning coefficients, particularly in the Sleep step, had a substantial effect on the convergence of the algorithm.

Factor Analysis Using Delta-Rule Wake-Sleep Learning

It is argued that the simplicity of wake-sleep learning makes factor analysis a plausible alternative to Hebbian learning as a model of activity-dependent cortical plasticity.
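
A rough sense of that simplicity: for a linear Gaussian factor model, both phases reduce to a few lines of delta-rule updates. The NumPy sketch below is a toy under stated assumptions (recognition noise and learned noise variances omitted, fixed isotropic observation noise, illustrative dimensions and learning rate), not the paper's exact procedure:

    import numpy as np

    rng = np.random.default_rng(1)
    d, k = 5, 2                        # observed and latent dimensions (illustrative)
    Lam = rng.normal(0, 0.1, (d, k))   # generative factor loadings
    W = rng.normal(0, 0.1, (k, d))     # linear recognition weights
    eps = 0.01

    # Toy data from a true factor model (loadings chosen arbitrarily here).
    X = (rng.normal(0, 1, (1000, k)) @ rng.normal(0, 1, (d, k)).T
         + 0.1 * rng.normal(0, 1, (1000, d)))

    for x in X:
        # Wake: recognize a latent cause, then nudge the generative loadings
        # toward reconstructing x (a delta rule on the reconstruction error).
        z = W @ x
        Lam += eps * np.outer(x - Lam @ z, z)

        # Sleep: fantasize (z, x) from the generative model, then nudge the
        # recognition weights toward recovering z (again a delta rule).
        z_f = rng.normal(0, 1, k)
        x_f = Lam @ z_f + 0.1 * rng.normal(0, 1, d)
        W += eps * np.outer(z_f - W @ x_f, x_f)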
...
