Statistical physics of inference: thresholds and algorithms

  • Lenka Zdeborová, Florent Krzakala
  • Advances in Physics
  • pages 453–552
Many questions of fundamental interest in today's science can be formulated as inference problems: some partial, or noisy, observations are performed over a set of variables and the goal is to recover, or infer, the values of the variables based on the indirect information contained in the measurements. For such problems, the central scientific questions are: Under what conditions is the information contained in the measurements sufficient for a satisfactory inference to be possible? What are… 

High-dimensional inference: a statistical mechanics perspective

This article aims at emphasizing some of the deep links connecting these apparently separated disciplines through the description of paradigmatic models of high-dimensional inference in the language of statistical mechanics.

Statistical mechanics approaches to optimization and inference

This thesis proposes several statistical-mechanics-based models for two classes of problems, optimization and inference, including a set of combinatorial optimization problems on graphs: the Prize-Collecting Steiner Tree and the Packing of Steiner Trees problems.

Understanding Phase Transitions via Mutual Information and MMSE

This chapter presents a tutorial description of the standard linear model and its connection to information theory, and describes the replica prediction for this model and outlines the authors' recent proof that it is exact.

Inverse statistical problems: from the inverse Ising problem to data science

This review focuses on the inverse Ising problem and closely related problems, namely how to infer the coupling strengths between spins from observed spin correlations, magnetizations, or other data.
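
As a concrete illustration of this inference task, the simplest mean-field approach estimates couplings by inverting the connected correlation matrix. The sketch below is illustrative only (the function name and the synthetic data are assumptions, not taken from the review):

```python
import numpy as np

def naive_mean_field_couplings(samples):
    """Estimate Ising couplings J_ij from +/-1 samples via naive
    mean-field inversion: J = -(C^{-1}) off-diagonal, where C is
    the connected correlation matrix of the spins."""
    m = samples.mean(axis=0)            # magnetizations <s_i>
    C = np.cov(samples, rowvar=False)   # connected correlations
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)            # no self-couplings
    return m, J

# Synthetic +/-1 samples (illustrative; not drawn from a true Ising model)
rng = np.random.default_rng(0)
samples = np.sign(rng.standard_normal((2000, 5)) + 0.1)
m, J = naive_mean_field_couplings(samples)
print(J.shape)   # (5, 5), symmetric with zero diagonal
```

More refined schemes (TAP inversion, pseudolikelihood) follow the same pattern but correct the mean-field estimate.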

On the glassy nature of the hard phase in inference problems

This work shows that, for noise-to-signal ratios below the information-theoretic threshold, the posterior measure is composed of an exponential number of metastable glassy states, and that this glassiness extends even slightly below the algorithmic threshold, below which the well-known approximate message passing algorithm is able to closely reconstruct the signal.

Mean-field theory of high-dimensional Bayesian inference

The goal is to recall basic (deep) concepts, as well as to provide some modern analytic and algorithmic tools used in high-dimensional inference.

Correspondence between thermodynamics and inference.

It is proposed that the Gibbs entropy provides a natural device for counting distinguishable distributions in the context of Bayesian inference and is used to define a generalized principle of indifference in which every distinguishable model is assigned equal a priori probability.

Fundamental problems in Statistical Physics XIV: Lecture on Machine Learning

  • A. Decelle
  • Computer Science
    Physica A: Statistical Mechanics and its Applications
  • 2022

Typology of phase transitions in Bayesian inference problems

The results show that the instability of the trivial fixed point is not generic evidence for the Bayes optimality of the message-passing algorithms, and shed light on the existence of hybrid-hard phases for a large class of planted constraint satisfaction problems.

Statistical physics of linear and bilinear inference problems

The aim of this thesis is to propose efficient algorithms for matrix compressed sensing and to perform their theoretical analysis, which reveals the existence of phases in which inference is easy, hard, or impossible, as well as instabilities in Bayesian bilinear inference algorithms.

Understanding belief propagation and its generalizations

It is shown that BP can only converge to a fixed point that is also a stationary point of the Bethe approximation to the free energy, which enables connections to be made with variational approaches to approximate inference.
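
The sum-product messages this result concerns can be illustrated on a tree, where BP is exact and the Bethe free energy coincides with the true free energy. A minimal sketch on a three-node chain, with pairwise potentials invented purely for illustration:

```python
import numpy as np
from itertools import product

# Pairwise model on a 3-node chain: p(x) ∝ psi01(x0,x1) psi12(x1,x2), x_i ∈ {0,1}
psi01 = np.array([[1.2, 0.5], [0.5, 1.2]])
psi12 = np.array([[0.8, 1.5], [1.5, 0.8]])

# Sum-product messages flowing into node 1 (exact on trees)
m_0to1 = psi01.T @ np.ones(2)   # sum over x0 of psi01[x0, x1]
m_2to1 = psi12 @ np.ones(2)     # sum over x2 of psi12[x1, x2]
b1 = m_0to1 * m_2to1
b1 /= b1.sum()                  # BP belief at node 1

# Brute-force marginal of x1 for comparison
p = np.zeros(2)
for x0, x1, x2 in product([0, 1], repeat=3):
    p[x1] += psi01[x0, x1] * psi12[x1, x2]
p /= p.sum()

print(b1, p)   # identical on a tree
```

On loopy graphs the same message updates are iterated to (hoped-for) convergence, and the fixed points are the stationary points of the Bethe approximation discussed above.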

Statistical Physics of Hard Optimization Problems

A new class of problems is introduced, named "locked" constraint satisfaction problems, whose statistical description is easily solvable but which are, from the algorithmic point of view, even more challenging than canonical satisfiability.

Advanced mean field methods: theory and practice

The theoretical foundations of advanced mean field methods are covered, the relation between the different approaches are explored, the quality of the approximation obtained is examined, and their application to various areas of probabilistic modeling is demonstrated.
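
The simplest of these methods, naive mean field, reduces to the self-consistency equations m_i = tanh(β(Σ_j J_ij m_j + h_i)), iterated to a fixed point. A minimal sketch with damping; the coupling matrix and parameters below are illustrative assumptions:

```python
import numpy as np

def naive_mean_field(J, h, beta, iters=500, damping=0.5):
    """Iterate the naive mean-field equations
    m_i = tanh(beta * (sum_j J_ij m_j + h_i)) with damping."""
    m = np.zeros(len(h))
    for _ in range(iters):
        m = (1 - damping) * m + damping * np.tanh(beta * (J @ m + h))
    return m

# Fully connected ferromagnet with a small external field (illustrative)
n = 4
J = (np.ones((n, n)) - np.eye(n)) / n
m = naive_mean_field(J, h=0.1 * np.ones(n), beta=2.0)
print(m)   # uniform positive magnetizations
```

TAP equations and the cavity method refine this scheme by adding reaction-term corrections, but keep the same fixed-point structure.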

Statistical mechanics of learning from examples.

It is shown that for smooth networks, i.e., those with continuously varying weights and smooth transfer functions, the generalization curve asymptotically obeys an inverse power law, while for nonsmooth networks other behaviors can appear, depending on the nature of the nonlinearities as well as the realizability of the rule.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction

This book is a valuable resource, both for the statistician needing an introduction to machine learning and related fields and for the computer scientist wishing to learn more about statistics, and statisticians will especially appreciate that it is written in their own language.

The Nature of Computation

The authors explain why the P vs. NP problem is so fundamental, and why it is so hard to resolve, and lead the reader through the complexity of mazes and games; optimization in theory and practice; randomized algorithms, interactive proofs, and pseudorandomness; Markov chains and phase transitions; and the outer reaches of quantum computing.

Graphical Models, Exponential Families, and Variational Inference

The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.

Probabilistic Inference Using Markov Chain Monte Carlo Methods

The role of probabilistic inference in artificial intelligence is outlined, the theory of Markov chains is presented, and various Markov chain Monte Carlo algorithms are described, along with a number of supporting techniques.
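
A minimal instance of the algorithms described is the Metropolis sampler. The sketch below targets the Boltzmann distribution of a one-dimensional Ising chain; the chain length, temperature, and step count are illustrative choices:

```python
import numpy as np

def metropolis_ising_chain(n=20, beta=0.5, steps=20000, seed=0):
    """Metropolis sampling of a 1D Ising chain with free boundaries,
    E(s) = -sum_i s_i s_{i+1}. Returns the mean energy after burn-in."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=n)
    energies = []
    for _ in range(steps):
        i = rng.integers(n)
        # Energy change from flipping spin i (boundary spins have one neighbor)
        dE = 2 * s[i] * (s[i - 1] * (i > 0) + s[(i + 1) % n] * (i < n - 1))
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i] = -s[i]
        energies.append(-np.sum(s[:-1] * s[1:]))
    return np.mean(energies[steps // 2:])   # discard first half as burn-in

e = metropolis_ising_chain()
print(e)   # negative: neighboring spins align at beta > 0
```

The acceptance rule min(1, e^{-βΔE}) enforces detailed balance with respect to the Boltzmann measure, which is what guarantees convergence of the chain to the target distribution.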

Statistical mechanics of unsupervised structure recognition

A model of unsupervised learning is studied in which the environment provides N-dimensional input examples drawn from two overlapping Gaussian clouds, and it is investigated how well the underlying structure can be inferred from a set of examples.

Statistical mechanics of community detection.

The properties of the ground state configuration are elucidated to give a concise definition of communities as cohesive subgroups in networks that is adaptive to the specific class of network under study.
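
The ground-state formulation alluded to here can be sketched with a simplified Potts-style Hamiltonian that rewards intra-community edges against a null model; the uniform null model, the toy network, and the function name below are assumptions for illustration, not the paper's exact definition:

```python
import numpy as np

def potts_community_energy(A, labels, gamma=1.0):
    """Simplified Potts-style community Hamiltonian:
    H = -sum_{i<j} (A_ij - gamma * p) * delta(label_i, label_j),
    with a uniform null model p equal to the mean edge density."""
    n = len(labels)
    p = A.sum() / (n * (n - 1))
    H = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            if labels[i] == labels[j]:
                H -= A[i, j] - gamma * p
    return H

# Toy network: two 3-node cliques joined by a single edge
A = np.zeros((6, 6))
for i, j in [(0,1), (0,2), (1,2), (3,4), (3,5), (4,5), (2,3)]:
    A[i, j] = A[j, i] = 1.0
good = [0, 0, 0, 1, 1, 1]   # the two cliques
bad  = [0, 1, 0, 1, 0, 1]   # labels cutting across the cliques
print(potts_community_energy(A, good), potts_community_energy(A, bad))
```

Minimizing H over label assignments (e.g. by simulated annealing) then recovers the cohesive subgroups: the clique-respecting partition has strictly lower energy than the scrambled one.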