Entropy landscape of solutions in the binary perceptron problem

@article{Huang2013EntropyLO,
  title={Entropy landscape of solutions in the binary perceptron problem},
  author={Haiping Huang and K. Y. Michael Wong and Yoshiyuki Kabashima},
  journal={ArXiv},
  year={2013},
  volume={abs/1304.2850}
}
The statistical picture of the solution space for a binary perceptron is studied. The binary perceptron learns a random classification of input random patterns by a set of binary synaptic weights. The learning of this network is difficult especially when the pattern (constraint) density is close to the capacity, which is supposed to be intimately related to the structure of the solution space. The geometrical organization is elucidated by the entropy landscape from a reference configuration and… 
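To make the setting concrete, the sketch below (not from the paper; N, alpha, and all names are illustrative choices) brute-forces a toy instance of the problem described in the abstract: it counts how many binary weight vectors solve a random classification task at each Hamming distance from a reference solution. The paper obtains this entropy landscape analytically in the large-N limit, whereas exhaustive enumeration is only feasible at toy sizes.

```python
import itertools
import numpy as np

def local_entropy_profile(N=15, alpha=0.4, seed=0):
    """Brute-force 'entropy landscape' around a reference solution (toy sizes only):
    count binary-weight solutions at each Hamming distance from the reference."""
    rng = np.random.default_rng(seed)
    M = max(1, int(alpha * N))              # number of patterns (constraints)
    xi = rng.choice([-1, 1], size=(M, N))   # random +-1 input patterns
    sigma = rng.choice([-1, 1], size=M)     # random target classifications

    def satisfies(w):
        # w is a solution if it classifies every pattern correctly
        return np.all(sigma * (xi @ w) > 0)

    # enumerate all 2^N binary weight vectors (feasible only for small N)
    solutions = [w for w in
                 (np.array(t) for t in itertools.product([-1, 1], repeat=N))
                 if satisfies(w)]
    if not solutions:
        return {}

    w_ref = solutions[0]                    # reference configuration
    counts = np.zeros(N + 1, dtype=int)
    for w in solutions:
        counts[int(np.sum(w != w_ref))] += 1
    # entropy density s(d) = log(#solutions at Hamming distance d) / N
    return {d: float(np.log(c)) / N for d, c in enumerate(counts) if c > 0}

print(local_entropy_profile())
```

Varying alpha in this toy shows how the profile changes as the constraint density grows toward the capacity, which is the regime the abstract is concerned with.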

Citations

Algorithms and Barriers in the Symmetric Binary Perceptron Model
TLDR
At high enough densities, the symmetric binary perceptron exhibits the multi-Overlap Gap Property (m-OGP), an intricate geometrical property known to be a rigorous barrier for large classes of algorithms.
Clustering of solutions in the symmetric binary perceptron
TLDR
The first steps toward a rigorous proof of the existence of a dense cluster in certain regimes of the parameters are performed, by computing first and second moment upper bounds for the existence of pairs of arbitrarily close solutions.
Frozen 1-RSB structure of the symmetric Ising perceptron
TLDR
It is proved, under an assumption on the critical points of a real-valued function, that the symmetric Ising perceptron exhibits the "frozen 1-RSB" structure; that is, typical solutions of the model lie in clusters of vanishing entropy density.
Equivalence between algorithmic instability and transition to replica symmetry breaking in perceptron learning systems
TLDR
The relationship between the algorithmic instability and the equilibrium analysis of the binary perceptron model has remained elusive; it is established here by showing that the instability condition around the algorithmic point is identical to the instability for breaking the replica-symmetric saddle-point solution of the free energy function.
Binary perceptron: efficient algorithms can find solutions in a rare well-connected cluster
TLDR
It is shown that at low constraint density there indeed exists a subdominant connected cluster of solutions with almost maximal diameter, and that an efficient multiscale majority algorithm can find solutions in such a cluster with high probability, settling in particular an open problem posed by Perkins-Xu in STOC'21.
Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems
TLDR
Introduces a novel Entropy-driven Monte Carlo (EdMC) strategy to efficiently sample solutions of random Constraint Satisfaction Problems (CSPs), together with a fast solver that relies exclusively on a local entropy estimate and can be applied to general CSPs (a toy sketch of the local-entropy idea appears after this list).
Equivalence between belief propagation instability and transition to replica symmetry breaking in perceptron learning systems
TLDR
The binary perceptron is a fundamental model of supervised learning and non-convex optimization, at the root of popular deep learning, yet the relationship between the algorithmic instability and the equilibrium analysis of the model remains elusive.
Storage capacity in symmetric binary perceptrons
TLDR
Studies the problem of determining the capacity of the binary perceptron for two variants in which the corresponding constraint is symmetric, showing that the critical capacity is given by the annealed computation in a large region of parameter space.
Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron
We consider the symmetric binary perceptron model, a simple model of neural networks that has gathered significant attention in the statistical physics, information theory and probability theory communities. …
Statistical physics of neural systems
TLDR
This work represents learning as an optimization problem, implementing a local search in synaptic space for specific configurations, known as solutions, which make a neural network able to accomplish a series of different tasks.
...
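The local-entropy idea mentioned in the EdMC entry above can be illustrated at toy scale: instead of a Monte Carlo chain driven by the energy (number of violated constraints), the chain below is driven by a brute-force count of solutions within a fixed Hamming radius. This is only a hedged sketch of that idea, not the cited paper's algorithm, which estimates the local entropy with belief propagation so that it scales to large N; all sizes, names, and parameters here are illustrative.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N, M, D = 12, 5, 3                         # toy sizes; D = local-entropy radius
xi = rng.choice([-1, 1], size=(M, N))      # random +-1 patterns
sigma = rng.choice([-1, 1], size=M)        # random labels

def is_solution(w):
    return np.all(sigma * (xi @ w) > 0)

# precompute all configurations and which of them are solutions (toy scale only)
all_w = np.array(list(itertools.product([-1, 1], repeat=N)))
sol_mask = np.array([is_solution(w) for w in all_w])

def local_entropy(w):
    # number of solutions within Hamming distance D of w (exact by enumeration)
    dist = np.sum(all_w != w, axis=1)
    return int(np.count_nonzero(sol_mask & (dist <= D)))

# Metropolis chain whose objective is the local entropy rather than the energy:
# it is pushed toward weight vectors surrounded by many nearby solutions.
w, beta = rng.choice([-1, 1], size=N), 2.0
for _ in range(2000):
    i = rng.integers(N)
    w_new = w.copy()
    w_new[i] *= -1
    dS = np.log(max(local_entropy(w_new), 1)) - np.log(max(local_entropy(w), 1))
    if rng.random() < np.exp(beta * dS):
        w = w_new

print("is a solution:", is_solution(w), "| nearby solutions:", local_entropy(w))
```

In this sketch D plays the role of the locality scale: larger D smooths the objective, smaller D makes it more sensitive to the immediate neighbourhood of the current configuration.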

References

Showing 1-10 of 53 references
Dynamics of learning for the binary perceptron problem
A polynomial learning algorithm for a perceptron with binary bonds and random patterns is investigated within dynamic mean field theory. A discontinuous freezing transition is found at a temperature …
Learning by random walks in the weight space of the Ising perceptron
TLDR
It is found that, for a given learning task, the solutions constructed by the random walk learning process are separated by a typical Hamming distance, which decreases with the constraint density α of the learning task; at a fixed value of α, the width of the Hamming distance distribution decreases with N.
The space of interactions in neural network models
The typical fraction of the space of interactions between each pair of N Ising spins which solve the problem of storing a given set of p random patterns as N-bit spin configurations is considered.
Combined local search strategy for learning in networks of binary synapses
TLDR
A combined stochastic local search strategy in the synaptic weight space is constructed to further improve the learning performance of a single random walker learning the same large set of patterns.
Broken symmetries in multilayered perceptrons.
The statistical mechanics of two-layered perceptrons with N input units, K hidden units, and a single output unit that makes a decision based on a majority rule (Committee Machine) is studied. …
Weight space structure and analysis using a finite replica number in the Ising perceptron
TLDR
It is shown that the analyticity of the rate function changes at α = α_GD = 1.245..., which implies that the dominant configuration of the atypically separable patterns exhibits a phase transition at this critical ratio.
Learning in the hypercube: a stepping stone to the binary perceptron
TLDR
The learning problem for storing random patterns in a perceptron with binary weights can be facilitated by pretraining an appropriate precursor network with continuous weights, and an upper bound is determined for the fraction of binary weights any precursor is able to predict correctly.
Entropy landscape and non-Gibbs solutions in constraint satisfaction problems
TLDR
It is shown that a smoothed version of a decimation strategy based on belief propagation is able to find solutions belonging to subdominant clusters even beyond the so-called rigidity transition where the thermodynamically relevant clusters become frozen.
Computational complexity, learning rules and storage capacities: A Monte Carlo study for the binary perceptron
TLDR
The connection between the computational complexity of learning algorithms and the attained storage capacity is clarified and it is shown that a polynomial time cooling schedule yields a vanishing storage capacity in the thermodynamic limit as predicted by the dynamical theory of Horner.
Efficient supervised learning in networks with binary synapses
TLDR
A neurobiologically plausible on-line learning algorithm, derived from belief propagation, performs remarkably well in a model neuron with binary synapses and a finite number of "hidden" states per synapse that has to learn a random classification task.
...