Gardner formula for Ising perceptron models at small densities
@inproceedings{Bolthausen2021GardnerFF,
  title     = {Gardner formula for Ising perceptron models at small densities},
  author    = {Erwin Bolthausen and Shuta Nakajima and Nike Sun and Chang Xu},
  booktitle = {Annual Conference Computational Learning Theory},
  year      = {2021}
}
We consider the Ising perceptron model with $N$ spins and $M = N\alpha$ patterns, with a general activation function $U$ that is bounded above. For $U$ bounded away from zero, or $U$ a one-sided threshold function, it was shown by Talagrand (2000, 2011) that for small densities $\alpha$, the free energy of the model converges in the large-$N$ limit to the replica symmetric formula conjectured in the physics literature (Krauth--Mezard 1989, see also Gardner--Derrida 1988). We give a new proof of this result…
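To make the model concrete, here is a minimal brute-force sketch of the quantity in question: the free energy $\frac{1}{N}\log Z$, where $Z = \sum_{\sigma \in \{-1,+1\}^N} \prod_{a=1}^{M} U\!\big(\langle g_a, \sigma\rangle/\sqrt{N}\big)$ with i.i.d. Gaussian patterns $g_a$. The function names, the seed, and the tiny system size are illustrative choices (enumeration over $2^N$ configurations is only feasible for small $N$); the paper's results concern the large-$N$ limit, not this computation.

```python
import numpy as np
from itertools import product

def ising_perceptron_free_energy(N, alpha, U, rng):
    """Brute-force (1/N) log Z for the Ising perceptron with M = round(N*alpha)
    i.i.d. Gaussian patterns. Only feasible for small N (2^N configurations)."""
    M = int(round(N * alpha))
    G = rng.standard_normal((M, N))           # pattern vectors g_a as rows
    Z = 0.0
    for sigma in product([-1.0, 1.0], repeat=N):
        s = np.array(sigma)
        fields = G @ s / np.sqrt(N)           # <g_a, sigma>/sqrt(N) for each a
        Z += np.prod([U(h) for h in fields])  # weight prod_a U(<g_a,sigma>/sqrt(N))
    return np.log(Z) / N

rng = np.random.default_rng(0)
# one-sided threshold activation U(h) = 1{h >= 0}, the classical half-space case
U = lambda h: 1.0 if h >= 0.0 else 0.0
f = ising_perceptron_free_energy(N=10, alpha=0.2, U=U, rng=rng)
```

With a $\{0,1\}$-valued $U$, $Z$ simply counts the spin configurations satisfying all $M$ half-space constraints, so $f$ lies between $0$ and $\log 2$ whenever the instance is satisfiable.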
8 Citations
Sharp threshold sequence and universality for Ising perceptron models
- 2023
Computer Science
SODA
We study a family of Ising perceptron models with $\{0,1\}$-valued activation functions. This includes the classical half-space models, as well as some of the symmetric models considered in recent…
Algorithms and Barriers in the Symmetric Binary Perceptron Model
- 2022
Computer Science
2022 IEEE 63rd Annual Symposium on Foundations of Computer Science (FOCS)
It is shown that at high enough densities the SBP exhibits the multi-Overlap Gap Property (m-OGP), an intricate geometrical property known to be a rigorous barrier for large classes of algorithms.
Injectivity of ReLU networks: perspectives from statistical physics
- 2023
Mathematics
ArXiv
When can the input of a ReLU neural network be inferred from its output? In other words, when is the network injective? We consider a single layer, $x \mapsto \mathrm{ReLU}(Wx)$, with a random…
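The map studied in this citation is easy to state numerically. The sketch below (variable names and the square-matrix setup are my own illustrative choices, not the paper's construction) shows why a single ReLU layer with as many outputs as inputs is typically not injective: any input that lands in the negative half-space of every row of $W$ is mapped to zero, along with all its positive multiples.

```python
import numpy as np

def relu_layer(W, x):
    """Single-layer map x -> ReLU(Wx)."""
    return np.maximum(W @ x, 0.0)

rng = np.random.default_rng(1)
n = 4
W = rng.standard_normal((n, n))       # square case: m = n output units

# Construct x_neg with W x_neg = -1 (all coordinates negative),
# so the ReLU zeroes out the entire output.
x_neg = np.linalg.solve(W, -np.ones(n))
y1 = relu_layer(W, x_neg)
y2 = relu_layer(W, 2.0 * x_neg)       # a different input with the same image
```

Both `y1` and `y2` are the zero vector, so the map identifies distinct inputs; injectivity requires sufficiently many rows relative to the input dimension, which is the threshold phenomenon the paper analyzes.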
Universality of Approximate Message Passing algorithms and tensor networks
- 2022
Computer Science
ArXiv
This work reduces AMP universality to the study of products of random matrices and diagonal tensors along a tensor network, and shows that the aforementioned matrix ensembles satisfy a notion of asymptotic freeness with respect to such tensor networks, which parallels usual definitions of freeness for traces of matrix products.
Geometric Barriers for Stable and Online Algorithms for Discrepancy Minimization
- 2023
Computer Science, Mathematics
ArXiv
This paper focuses on the algorithmic tractability of two models: (i) discrepancy minimization, and (ii) the symmetric binary perceptron (\texttt{SBP}), a random constraint satisfaction problem as well as a toy model of a single-layer neural network.
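For reference, the symmetric binary perceptron (SBP) is the random CSP whose solutions are sign vectors $\sigma \in \{-1,+1\}^n$ with $|\langle g_a, \sigma\rangle| \le \kappa\sqrt{n}$ for every Gaussian constraint row $g_a$; this standard definition is not spelled out in the snippet above, and the helper below is my own illustrative sketch of the constraint check.

```python
import numpy as np

def sbp_satisfies(G, sigma, kappa):
    """Check the SBP constraints |<g_a, sigma>| <= kappa * sqrt(n) for all rows g_a."""
    n = sigma.size
    return bool(np.all(np.abs(G @ sigma) <= kappa * np.sqrt(n)))

rng = np.random.default_rng(2)
n, m, kappa = 20, 5, 1.0
G = rng.standard_normal((m, n))           # m Gaussian constraints
sigma = rng.choice([-1, 1], size=n)       # a candidate sign vector
ok = sbp_satisfies(G, sigma, kappa)
```

The two-sided constraint is what makes the model "symmetric" (its solution count concentrates unusually well), in contrast to the one-sided half-space constraints of the classical perceptron.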
Free energy of a diluted spin glass model with quadratic Hamiltonian
- 2023
Mathematics
The Annals of Probability
We study a diluted mean-field spin glass model with a quadratic Hamiltonian. Our main result establishes the limiting free energy in terms of an integral of a family of random variables that are the…
Algorithmic Pure States for the Negative Spherical Perceptron
- 2022
Computer Science
Journal of Statistical Physics
Various characteristics of $S$, such as its measure and the largest $M$ for which it is non-empty, were computed heuristically in statistical physics in the asymptotic regime.