Capacity lower bound for the Ising perceptron

@article{Ding2019CapacityLB,
  title={Capacity lower bound for the Ising perceptron},
  author={Jian Ding and Nike Sun},
  journal={Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing},
  year={2019}
}
  • Published 20 September 2018
  • Mathematics, Computer Science
We consider the Ising perceptron with Gaussian disorder, which is equivalent to the discrete cube {−1,+1}^N intersected by M random half-spaces. The perceptron's capacity is the largest integer M_N for which the intersection is nonempty. It is conjectured by Krauth and Mézard (1989) that the (random) ratio M_N/N converges in probability to an explicit constant α⋆ ≐ 0.83. Kim and Roche (1998) proved the existence of a positive constant γ such that γ ≤ M_N/N ≤ 1 − γ with high probability; see also…
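The capacity M_N defined in the abstract can be sampled directly for very small N by brute force: draw Gaussian half-spaces one at a time and track which sign vectors σ ∈ {−1,+1}^N still satisfy every constraint ⟨g_i, σ⟩ ≥ 0. The sketch below (not from the paper; the function name `capacity_sample` and all parameter choices are ours) does exactly that. At such small N the empirical mean of M_N/N is dominated by finite-size effects and should not be expected to match α⋆ ≐ 0.83.

```python
import numpy as np

def capacity_sample(N, rng):
    """Draw one sample of the Ising perceptron capacity M_N (small N only).

    Exhaustively tracks which sign vectors sigma in {-1,+1}^N still satisfy
    <g_i, sigma> >= 0 for every Gaussian vector g_i drawn so far; the
    capacity is the number of half-spaces added just before the survivor
    set becomes empty.
    """
    # All 2^N sign vectors, one per row of a (2^N, N) matrix.
    sigmas = np.array([[1 if (s >> j) & 1 else -1 for j in range(N)]
                       for s in range(2 ** N)])
    alive = np.ones(len(sigmas), dtype=bool)
    M = 0
    while True:
        g = rng.standard_normal(N)        # one random half-space
        alive &= (sigmas @ g >= 0)
        if not alive.any():
            return M                      # intersection just became empty
        M += 1

rng = np.random.default_rng(0)
N = 12
ratios = [capacity_sample(N, rng) / N for _ in range(50)]
print(f"mean M_N/N at N = {N}: {np.mean(ratios):.3f}")
```

Enumeration costs O(2^N · N) per constraint, so this is only a toy check; the papers listed below study the N → ∞ regime, where no such brute force is possible.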


Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron
We consider the symmetric binary perceptron model, a simple model of neural networks that has gathered significant attention in the statistical physics, information theory and probability theory…
Critical Window of The Symmetric Perceptron
The critical window of the symmetric binary perceptron has nearly the “sharpest possible transition,” adding it to a short list of CSPs for which the critical window is rigorously known to be of near-constant width.
Frozen 1-RSB structure of the symmetric Ising perceptron
It is proved, under an assumption on the critical points of a real-valued function, that the symmetric Ising perceptron exhibits the `frozen 1-RSB' structure; that is, typical solutions of the model lie in clusters of vanishing entropy density.
Algorithms and Barriers in the Symmetric Binary Perceptron Model
At high enough densities the symmetric binary perceptron exhibits the multi Overlap Gap Property (m-OGP), an intricate geometrical property known to be a rigorous barrier for large classes of algorithms.
Storage capacity in symmetric binary perceptrons
The replica method is used to estimate the capacity threshold for the rectangle-binary-perceptron case when the u-function is wide and it is concluded that full-step-replica-symmetry breaking would have to be evaluated in order to obtain the exact capacity in this case.
Sharp threshold sequence and universality for Ising perceptron models
The results of this paper apply in more general settings, and are based on new “add one constraint” estimates extending Talagrand’s estimates for the half-space model (1999, 2011).
Sharp threshold for the Ising perceptron model
  • Changjin Xu
  • Mathematics, Computer Science
    The Annals of Probability
  • 2021
It is proved that this event has a sharp threshold; that is, the probability that the intersection is empty increases quickly from $\epsilon$ to $1-\epsilon$ when $p$ increases only by a factor of $1 + o(1)$ as $N \to \infty$.
Binary perceptron: efficient algorithms can find solutions in a rare well-connected cluster
It is shown that at low constraint density, there exists indeed a subdominant connected cluster of solutions with almost maximal diameter, and that an efficient multiscale majority algorithm can find solutions in such a cluster with high probability, settling in particular an open problem posed by Perkins-Xu in STOC'21.
The discrepancy of random rectangular matrices
A complete answer to the Beck–Fiala conjecture is given for two natural models: matrices with Bernoulli or Poisson entries, and the discrepancy for any rectangular aspect ratio is characterized.
Algorithmic pure states for the negative spherical perceptron
An efficient algorithm is designed which, given oracle access to the solution of the Parisi variational principle, exploits this conjectured FRSB structure for $\kappa<0$ and outputs a vector $\sigma$ which is expected to be approximately the barycenter of a pure state of the spherical perceptron near criticality.

References

Showing 1–10 of 45 references
Covering Cubes by Random Half Cubes with Applications to Binary Neural Networks
Let Q_n be the (hyper)cube {−1, 1}^n. This paper is concerned with the following question: how many vectors must be chosen uniformly and independently at random from Q_n before every vector in Q_n itself has…
Universality of the SAT-UNSAT (jamming) threshold in non-convex continuous constraint satisfaction problems
It is conjectured that there is a large universality class of non-convex continuous CSPs whose SAT-UNSAT threshold is described by the same scaling solution, and it is proposed that the perceptron is the simplest prototype of these problems.
TAP free energy, spin glasses and variational inference
We consider the Sherrington-Kirkpatrick model of spin glasses with ferromagnetically biased couplings. For a specific choice of the couplings mean, the resulting Gibbs measure is equivalent to the…
The space of interactions in neural network models
The typical fraction of the space of interactions between each pair of N Ising spins which solve the problem of storing a given set of p random patterns as N-bit spin configurations is considered.
High dimensional robust M-estimation: asymptotic variance via approximate message passing
It is shown that this phenomenon can be characterized rigorously using techniques that were developed by the authors for analyzing the Lasso estimator under high-dimensional asymptotics, and clarified that the ‘extra Gaussian noise’ encountered in this problem is fundamentally similar to phenomena already studied for regularized least squares in the setting of n…
Finite-sample analysis of Approximate Message Passing
This paper derives a concentration result for AMP with i.i.d. Gaussian measurement matrices with finite dimension n × N, and shows that the probability of deviation from the state evolution prediction falls exponentially in n.
Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems
A novel entropy-driven Monte Carlo strategy to efficiently sample solutions of random constraint satisfaction problems (CSPs) and a fast solver that relies exclusively on a local entropy estimate is constructed, and can be applied to general CSPs.
The simplest model of jamming
It is shown that isostaticity is not a sufficient condition for singular force and gap distributions, and universality is hypothesized for a large class of non-convex constraint satisfaction problems with continuous variables.
The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
This paper proves that indeed it holds asymptotically in the large system limit for sensing matrices with independent and identically distributed Gaussian entries, and provides rigorous foundation to state evolution.
The Thermodynamic Limit in Mean Field Spin Glass Models
We present a simple strategy to show the existence and uniqueness of the infinite volume limit of thermodynamic quantities, for a large class of mean field disordered models, as…