Gaussian-binary restricted Boltzmann machines for modeling natural image statistics

@article{Melchior2017GaussianbinaryRB,
  title={Gaussian-binary restricted Boltzmann machines for modeling natural image statistics},
  author={Jan Melchior and Nan Wang and Laurenz Wiskott},
  journal={PLoS ONE},
  year={2017},
  volume={12}
}
We present a theoretical analysis of Gaussian-binary restricted Boltzmann machines (GRBMs) from the perspective of density models. The key aspect of this analysis is to show that GRBMs can be formulated as a constrained mixture of Gaussians, which gives a much better insight into the model’s capabilities and limitations. We further show that GRBMs are capable of learning meaningful features without using a regularization term and that the results are comparable to those of independent component… 
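The constrained mixture-of-Gaussians view described in the abstract can be made concrete with a small numerical sketch. The snippet below uses one common GRBM parameterization (not necessarily the exact one in the paper) and enumerates all 2^H hidden states to build the implied mixture: each hidden configuration h contributes a Gaussian component with mean b + Wh, and its mixing weight follows from integrating the visible units out of the Boltzmann distribution. All parameter values are illustrative.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy GRBM: 2 Gaussian visible units, 3 binary hidden units.
# Energy (one common convention; the paper's may differ):
#   E(v, h) = ||v - b||^2 / (2 sigma^2) - c.h - (v.W h) / sigma^2
n_vis, n_hid = 2, 3
W = rng.normal(scale=0.5, size=(n_vis, n_hid))
b = np.zeros(n_vis)   # visible biases
c = np.zeros(n_hid)   # hidden biases
sigma2 = 1.0          # shared visible variance

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def p_h_given_v(v):
    # p(h_j = 1 | v) = sigmoid(c_j + (v . W_j) / sigma^2)
    return sigmoid(c + v @ W / sigma2)

def mean_v_given_h(h):
    # p(v | h) is Gaussian with mean b + W h and covariance sigma^2 I
    return b + W @ h

# Mixture view: one Gaussian component per hidden configuration.
components = []
for h in product([0, 1], repeat=n_hid):
    h = np.asarray(h, dtype=float)
    mu = mean_v_given_h(h)
    # Log of the unnormalized mixing weight, obtained by completing
    # the square in v and integrating it out of exp(-E).
    log_w = c @ h + (mu @ mu - b @ b) / (2 * sigma2)
    components.append((log_w, mu))

log_ws = np.array([lw for lw, _ in components])
weights = np.exp(log_ws - log_ws.max())
weights /= weights.sum()
print(len(components))  # 2^3 = 8 mixture components
```

The constraint is visible in the construction: the 2^H component means are not free parameters but are tied together through the single matrix W, which is what limits the model relative to an unconstrained mixture of Gaussians.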
A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines
This work derives a deterministic framework for the training, evaluation, and use of RBMs based upon the Thouless-Anderson-Palmer (TAP) mean-field approximation of widely-connected systems with weak interactions coming from spin-glass theory.
Background subtraction using Gaussian-Bernoulli restricted Boltzmann machine
This work proposes a novel background subtraction method based on Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) and presents a simple technique to reconstruct the learned background model from a given input frame and to extract the foreground from the background using the variance learned for each pixel.
Restricted Boltzmann Machines With Gaussian Visible Units Guided by Pairwise Constraints
This paper proposes a pairwise-constraints (PCs) RBM with Gaussian visible units (pcGRBM), in which the learning procedure is guided by PCs and the encoding process is conducted under this guidance, to enhance the expressive ability of traditional RBMs.
A new Restricted Boltzmann Machine training algorithm for image restoration
A new Restricted Boltzmann Machine (RBM) training algorithm for handling corrupted data is proposed, and it is shown that the model can be used as a robust feature extractor, even for unclean data.
Learning Gaussian-Bernoulli RBMs using Difference of Convex Functions Optimization
  • V. Upadhya, P. Sastry · IEEE Transactions on Neural Networks and Learning Systems · 2021
This work shows that the negative log-likelihood of a GB-RBM can be expressed as a difference of convex functions if the variance of the conditional distribution of the visible units and the visible biases are held constant, and proposes an S-DCP algorithm that outperforms the CD and PCD algorithms in terms of learning speed and the quality of the learned generative model.
Salient Object Detection based on Bayesian Surprise of Restricted Boltzmann Machine
An algorithm for salient object detection that leverages the Bayesian surprise of the Restricted Boltzmann Machine; experiments on three datasets (MSRA-10K, ECSSD, and DUTS) show promising performance.
A Novel Gaussian–Bernoulli Based Convolutional Deep Belief Networks for Image Feature Extraction
The experimental results show that the proposed Gaussian–Bernoulli based Convolutional Deep Belief Network is more effective than several popular methods on most image-recognition tasks, at comparably low computational cost, suggesting that the proposed deep network is a potentially applicable method for real-world image recognition.
Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems
This thesis investigates the performance of Minimum Probability Flow learning for training restricted Boltzmann machines (RBMs) and proposes a more general form for the sampling dynamics in MPF, and explores the consequences of different choices for these dynamics for training RBMs.

References

Showing 1–10 of 70 references
An analysis of Gaussian-binary restricted Boltzmann machines for natural images
A rewritten formula of the probability density function as a linear superposition of Gaussians is derived and it is shown how Gaussian-binary RBMs learn natural image statistics.
Modeling pixel means and covariances using factorized third-order boltzmann machines
This approach provides a probabilistic framework for the widely used simple-cell/complex-cell architecture; it produces very realistic samples of natural images and extracts features that yield state-of-the-art recognition accuracy on the challenging CIFAR-10 dataset.
Investigating Convergence of Restricted Boltzmann Machine Learning
This work investigates the learning behavior of training algorithms by varying minimal set of parameters and shows that with relatively simple variants of CD, it is possible to obtain good results even without further regularization.
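The CD variants discussed in the entry above can be illustrated with a minimal CD-1 update for a binary RBM. This is a generic textbook sketch under simplifying assumptions, not the specific algorithm of any paper listed here; the function name and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, b, c, v0, lr=0.01):
    """One CD-1 parameter update for a binary RBM (minimal sketch).

    v0 is a batch of binary visible vectors, shape (batch, n_vis)."""
    # Positive phase: hidden activations driven by the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step down to the visibles and back up.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Gradient estimate: data statistics minus reconstruction statistics.
    dW = (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    W += lr * dW
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Because the negative phase is truncated after a single Gibbs step, the update follows a biased gradient estimate, which is precisely why convergence behavior of CD-style training, as investigated in the entry above, is a nontrivial question.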
In All Likelihood, Deep Belief Is Not Enough
A consistent estimator for the likelihood of deep belief networks is introduced which is computationally tractable and simple to apply in practice and finds that the deep belief network is outperformed with respect to the likelihood even by very simple mixture models.
Deep Boltzmann Machines
A new learning algorithm for Boltzmann machines that contain many layers of hidden variables, made more efficient by a layer-by-layer “pre-training” phase that allows variational inference to be initialized with a single bottom-up pass.
Factored 3-Way Restricted Boltzmann Machines For Modeling Natural Images
A factored 3-way RBM is proposed that uses the states of its hidden units to represent abnormalities in the local covariance structure of an image to provide a probabilistic framework for the widely used simple/complex cell architecture.
Nonlinear and extra-classical receptive field properties and the statistics of natural scenes.
This work uses multivariate wavelet statistics to demonstrate that strictly linear processing would inevitably leave substantial statistical dependencies between the outputs of the units, and considers how the basic nonlinearities of cortical neurons (gain control and ON/OFF half-wave rectification) can exploit these higher-order statistical dependencies.
Nonlinear and extra-classical receptive field properties and the statistics of natural scenes
Extensions of the previous investigations of the exploitation of higher-order statistics by nonlinear neurons are presented, showing that gain control provides an adaptation to the polar separability of the multivariate probability density function (PDF), and, together with an output nonlinearity, enables an overcomplete sparse coding.
Empirical Analysis of the Divergence of Gibbs Sampling Based Learning Algorithms for Restricted Boltzmann Machines
The results indicate that the log-likelihood seems to diverge especially if the target distribution is difficult for the RBM to learn, and that weight decay with a carefully chosen weight-decay parameter can prevent divergence.
Probabilistic framework for the adaptation and comparison of image codes
The learned bases are shown to have better coding efficiency than traditional Fourier and wavelet bases and to provide a Bayesian solution to the problems of image denoising and filling in of missing pixels.