Parameter Space Compression Underlies Emergent Theories and Predictive Models

Benjamin B. Machta, Ricky Chachra, Mark K. Transtrum, and James P. Sethna. pp. 604–607.

Multiparameter models, which can emerge in biology and other disciplines, are often sensitive to only a small number of parameters and robust to changes in the rest; approaches from information theory can be used to distinguish between the two parameter groups. In physics, on the other hand, one does not need to know the details at smaller length and time scales in order to understand the behavior at large scales. This hierarchy has been recognized for a long time…

Perspective: Sloppiness and emergent theories in physics, biology, and beyond.

It is suggested that the complex world is understandable for the same fundamental reason: simple theories of macroscopic behavior are hidden inside complicated microscopic processes.

Generalized scale behavior and renormalization group for data analysis

This work focuses on an aspect of renormalization closely related to principal component analysis (PCA) for large-dimensional data sets whose covariance has a nearly continuous spectrum, and proposes a deeper formalism that goes beyond power-law assumptions in explicit computations.
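As a toy illustration of the PCA side of this correspondence, the sketch below (a minimal illustration; the synthetic data, dimensions, and smoothly decaying spectrum are assumptions, not taken from the paper) eigendecomposes the empirical covariance of high-dimensional data whose spectrum is nearly continuous rather than a few isolated components:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: samples whose population covariance spectrum decays
# smoothly (nearly continuous), as in the large-dimensional setting above.
d, n = 50, 5000
true_spectrum = 1.0 / np.arange(1, d + 1)           # smooth, power-law-like decay
X = rng.normal(size=(n, d)) * np.sqrt(true_spectrum)

# PCA step: eigendecompose the empirical covariance matrix.
cov = np.cov(X, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]             # sorted descending

print(eigvals[:5])
```

With a spectrum like this there is no clean gap separating "signal" from "noise" components, which is exactly the regime where a formalism beyond power-law assumptions becomes useful.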

Mutual information, neural networks and the renormalization group

This work demonstrates a machine-learning algorithm capable of identifying the relevant degrees of freedom of a system and executing RG steps iteratively without any prior knowledge about the system, and applies the algorithm to classical statistical physics problems in one and two dimensions.

Exploring the landscape of model representations

A statistical physics framework for exploring and quantitatively characterizing the space of order parameters used to represent physical systems is applied to a microscopic protein model, suggesting an emergent length scale for coarse-graining proteins.

Information geometry for multiparameter models: New perspectives on the origin of simplicity

A Bayesian prior that optimizes the mutual information between model parameters and experimental data is introduced; it naturally favors points on the emergent boundary theories, and thus simpler models, and is efficiently approximated by a ‘projected maximum likelihood’ prior.

Relevance in the Renormalization Group and in Information Theory

It is shown analytically that, for statistical physical systems described by a field theory, the relevant degrees of freedom found using information bottleneck (IB) compression indeed correspond to operators with the lowest scaling dimensions. This provides a dictionary connecting two distinct theoretical toolboxes, and an example of constructively incorporating physical interpretability into applications of deep learning in physics.

Field theoretical renormalization group approach for signal detection

Renormalization group techniques are widely used in modern physics to describe the low-energy relevant aspects of systems involving a large number of degrees of freedom…

The renormalization group via statistical inference

In physics, one attempts to infer the rules governing a system given only the results of imperfect measurements; hence, microscopic theories may be effectively indistinguishable experimentally…

Discovering Reduced-Order Dynamical Models From Data

This work explores theoretical and computational principles for data-driven discovery of reduced-order models of physical phenomena through the lens of information geometry, examining how coarse-graining a system affects the local and global geometry of the “model manifold,” the set of all models that could be fit using data from the system.

Parameter Space Compression Underlies Emergent Theories and Predictive Models

The microscopically complicated real world exhibits behavior that often yields to simple yet quantitatively accurate descriptions. Predictions are possible despite large uncertainties in microscopic parameters…

Relating Fisher information to order parameters.

The framework presented here reveals the basic thermodynamic reasons behind similar empirical observations reported previously and highlights the generality of Fisher information as a measure that can be applied to a broad range of systems, particularly those where the determination of order parameters is cumbersome.

Information geometry of finite Ising models

Scaling and Renormalization in Statistical Physics

This text provides a thoroughly modern graduate-level introduction to the theory of critical behaviour, beginning with a brief review of phase transitions in simple systems and of mean-field theory…

An exact mapping between the Variational Renormalization Group and Deep Learning

This work constructs an exact mapping between the variational renormalization group, first introduced by Kadanoff, and deep learning architectures based on Restricted Boltzmann Machines (RBMs), suggesting that deep learning algorithms may be employing a generalized RG-like scheme to learn relevant features from data.

Sloppy-model universality class and the Vandermonde matrix.

It is observed that the eigenvalue spectra for the sensitivity of sloppy models have a striking, characteristic form with a density of logarithms of eigenvalues which is roughly constant over a large range.
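A minimal sketch of how such a spectrum can be observed, using the textbook sloppy example of fitting a sum of exponentials (the model, rates, and time points here are illustrative assumptions, not from the paper):

```python
import numpy as np

# Hypothetical sloppy model: y(t, theta) = sum_k exp(-theta_k * t),
# evaluated at a grid of observation times.
t = np.linspace(0.1, 3.0, 20)
theta = np.array([0.3, 1.0, 3.0, 10.0])

# Jacobian of predictions with respect to parameters:
# J[i, k] = d y(t_i) / d theta_k = -t_i * exp(-theta_k * t_i)
J = np.stack([-t * np.exp(-th * t) for th in theta], axis=1)

# Sensitivity (Fisher-information-like) matrix and its eigenvalues.
H = J.T @ J
eigs = np.sort(np.linalg.eigvalsh(H))[::-1]

# Sloppiness: the eigenvalues spread over many decades,
# roughly evenly spaced in the logarithm.
print(np.log10(eigs / eigs[0]))
```

Even with only four parameters, the eigenvalues of H typically span several orders of magnitude, which is the characteristic sloppy spectrum described above.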

Universally Sloppy Parameter Sensitivities in Systems Biology Models

The results suggest that sloppy sensitivity spectra are universal in systems biology models, highlight the power of collective fits, and suggest that modelers should focus on predictions rather than on parameters.

Structural susceptibility and separation of time scales in the van der Pol oscillator.

It is shown that separating the time scales in the van der Pol system leads to a further separation of eigenvalues: parameter combinations that perturb the slow manifold are stiffer, while those that solely affect the jumps in the dynamics are sloppier.

Sloppy models, parameter uncertainty, and the role of experimental design.

Computational models are increasingly used to understand and predict complex biological phenomena. These models contain many unknown parameters, at least some of which are difficult to measure

Why are nonlinear fits to data so challenging?

It is observed that the model manifold, in addition to being tightly bounded, has low extrinsic curvature; this motivates the use of geodesics in the fitting process and improves the convergence of the Levenberg-Marquardt algorithm by adding geodesic acceleration to the usual step.
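The geodesic-acceleration idea can be sketched as follows. This is a simplified illustration on assumed toy data, not the authors' implementation: it estimates the second directional derivative of the residuals by finite differences and omits refinements such as the usual acceptance criterion on the ratio of acceleration to velocity.

```python
import numpy as np

# Toy problem (an assumption for illustration): fit a two-exponential model
# to synthetic data generated from known rates.
t = np.linspace(0.0, 2.0, 30)
theta_true = np.array([1.0, 3.0])

def residuals(theta):
    model = np.exp(-theta[0] * t) + np.exp(-theta[1] * t)
    data = np.exp(-theta_true[0] * t) + np.exp(-theta_true[1] * t)
    return model - data

def jacobian(theta):
    return np.stack([-t * np.exp(-theta[0] * t),
                     -t * np.exp(-theta[1] * t)], axis=1)

def lm_geodesic(theta, lam=1e-3, iters=50, h=0.1):
    for _ in range(iters):
        r, J = residuals(theta), jacobian(theta)
        A = J.T @ J + lam * np.eye(len(theta))
        v = np.linalg.solve(A, -J.T @ r)            # usual LM step ("velocity")
        # Second directional derivative of the residuals along v,
        # estimated by a central finite difference.
        rvv = (residuals(theta + h * v) - 2 * r + residuals(theta - h * v)) / h**2
        a = np.linalg.solve(A, -J.T @ rvv)          # geodesic acceleration
        step = v + 0.5 * a
        if np.linalg.norm(residuals(theta + step)) < np.linalg.norm(r):
            theta, lam = theta + step, lam / 2      # accept step, relax damping
        else:
            lam *= 10                               # reject step, increase damping
    return theta

theta_fit = lm_geodesic(np.array([0.5, 4.0]))
print(theta_fit)
```

The acceleration term corrects the step along the curved path the parameters would follow on the model manifold, which is what improves convergence when the manifold is tightly bounded.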