• Corpus ID: 17761871

When are correlations strong

Feraz Azhar and William Bialek, "When are correlations strong?", arXiv: Neurons and Cognition.
The inverse problem of statistical mechanics involves finding the minimal Hamiltonian that is consistent with some observed set of correlation functions. This problem has received renewed interest in the analysis of biological networks; in particular, several such networks have been described successfully by maximum entropy models consistent with pairwise correlations. These correlations are usually weak in an absolute sense (e.g., correlation coefficients ~ 0.1 or less), and this is sometimes… 
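To make "weak in an absolute sense" concrete, here is a minimal sketch (synthetic data; all parameters are illustrative, not from the paper) that builds binarized spike trains driven by a rare shared input and measures their pairwise Pearson correlation coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binarized spike raster: n_neurons x n_bins, entries in {0, 1}.
# A rare shared drive induces small positive pairwise correlations.
n_neurons, n_bins = 5, 10_000
shared = rng.random(n_bins) < 0.005               # rare common drive
private = rng.random((n_neurons, n_bins)) < 0.05  # independent spiking
spikes = (private | shared).astype(int)

# Pairwise Pearson correlation coefficients between all neurons.
C = np.corrcoef(spikes)
offdiag = C[np.triu_indices(n_neurons, k=1)]
print(f"mean pairwise correlation: {offdiag.mean():.3f}")  # small, order 0.1
```

Even at this magnitude, such pairwise correlations can strongly constrain the joint distribution of the network, which is the question the paper takes up.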


Searching for Collective Behavior in a Large Network of Sensory Neurons

The properties of the neural vocabulary are explored by estimating its entropy, which constrains the population's capacity to represent visual information, and by classifying activity patterns into a small set of metastable collective modes, showing that the neural codeword ensembles are extremely inhomogeneous.

A dynamical state underlying the second order maximum entropy principle in neuronal networks

This work addresses why the second-order maximum entropy model, constrained only by firing rates and second-order correlations between neurons, captures the observed distribution of firing patterns in many neuronal networks, and explores a possible dynamical state in which a recursive relation keeps the strengths of higher-order interactions smaller than those of lower orders.

Inferring structural connectivity using Ising couplings in models of neuronal networks

This paper studies how well Ising-model couplings infer synaptic connectivity in in silico networks of neurons, comparing their performance against partial correlations and cross-correlations for different correlation levels, firing rates, network sizes, network densities, and topologies.

Chaotic Dynamics in Networks of Spiking Neurons in the Balanced State

It is demonstrated that neural networks in the balanced state generally exhibit chaotic dynamics, and a novel approach is introduced to thoroughly characterize the network dynamics and quantify information preservation and erasure.

Bounds on the Entropy of a Binary System with Known Mean and Pairwise Constraints

Badr Faisal Albanna (advisor: Michael R. DeWeese). Abstract: Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured…

The role of fluctuations in determining cellular network thermodynamics

This work develops a thermodynamic description of biological networks at the level of microscopic interactions between network variables, and conjectures that there is an upper limit to the rate of dissipative heat produced by a biological system that is associated with system size or modularity.

Properties of a Multidimensional Landscape Model for Determining Cellular Network Thermodynamics

The magnitudes of the landscape gradients and the dynamic correlated fluctuations of network variables are experimentally accessible and provide insight into the composition of the network and the relative thermodynamic contributions from network components.

Collective behaviour of social bots is encoded in their temporal Twitter activity

It is shown that, while pairwise correlations between users are weak, they coexist with collective correlated states; however, the statistics of correlations and co-spiking probability differ between the two populations.



Weak pairwise correlations imply strongly correlated network states in a neural population

It is shown, in the vertebrate retina, that weak correlations between pairs of neurons coexist with strongly collective behaviour in the responses of ten or more neurons, and it is found that this collective behaviour is described quantitatively by models that capture the observed pairwise correlations but assume no higher-order interactions.

Spin glass models for a network of real neurons

It is shown that pairwise interactions between neurons account for observed higher-order correlations, and that for groups of 10 or more neurons pairwise interactions can no longer be regarded as small perturbations of an independent system.

Faster solutions of the inverse pairwise Ising problem

A combination of recent coordinate descent algorithms with an adaptation of the histogram Monte Carlo method is used; the resulting algorithm learns the parameters of an Ising model describing a network of forty neurons within a few minutes.
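The fitting loop that such algorithms accelerate can be sketched at toy scale. This is a hedged illustration, not the cited method: plain gradient ascent on the likelihood ("Boltzmann learning") with exact enumeration of states in place of Monte Carlo sampling, which is feasible only for very small n; all names and parameters are made up.

```python
import itertools
import numpy as np

# Toy inverse Ising problem: recover fields h and couplings J from the
# mean spins <s_i> and pairwise moments <s_i s_j> of a known model.
# Exact enumeration of all 2^n states stands in for Monte Carlo here.
rng = np.random.default_rng(1)
n = 4
J_true = 0.3 * rng.standard_normal((n, n))
J_true = (J_true + J_true.T) / 2       # symmetric couplings
np.fill_diagonal(J_true, 0.0)
h_true = 0.1 * rng.standard_normal(n)

states = np.array(list(itertools.product([-1, 1], repeat=n)))  # all 2^n spin states

def moments(h, J):
    """Exact <s_i> and <s_i s_j> under p(s) proportional to exp(h.s + s'Js/2)."""
    E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E)
    p /= p.sum()
    m = p @ states                                    # <s_i>
    C = np.einsum('s,si,sj->ij', p, states, states)   # <s_i s_j>
    return m, C

m_data, C_data = moments(h_true, J_true)   # the "observed" correlation functions

# Gradient ascent on the log-likelihood: nudge parameters until the
# model's moments match the observed moments.
h = np.zeros(n)
J = np.zeros((n, n))
for _ in range(2000):
    m, C = moments(h, J)
    h += 0.1 * (m_data - m)
    J += 0.1 * (C_data - C)
    np.fill_diagonal(J, 0.0)

print("max coupling error:", np.abs(J - J_true).max())
```

Because the log-likelihood of this exponential-family model is concave in (h, J), the moment-matching iteration converges to the unique maximum-entropy solution; the contribution of the cited work is making each step cheap for networks of ~40 neurons, where exhaustive enumeration is impossible.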

Pairwise Maximum Entropy Models for Studying Large Biological Systems: When They Can Work and When They Can't

This work provides a general framework for determining the extent to which pairwise models can be used to predict the behavior of large biological systems and shows that, in most cases, they will not.

Rediscovering the power of pairwise interactions

Two recent streams of work suggest that pairwise interactions may be sufficient to capture the complexity of biological systems ranging from protein structure to networks of neurons, and it is shown that the two approaches are mathematically equivalent.

A Maximum Entropy Model Applied to Spatial and Temporal Correlations from Cortical Networks In Vitro

Although a second-order maximum entropy model successfully predicts correlated states in cortical networks, it should be extended to account for temporal correlations observed between states, and a significant relationship between strong pairwise temporal correlations and observed sequence length is found.

Ising models for networks of real neurons

Ising models with pairwise interactions are the least structured, or maximum-entropy, probability distributions that exactly reproduce measured pairwise correlations between spins; here, such models are constructed to describe the correlated spiking activity of populations of 40 neurons in the retina.

Constraint satisfaction problems and neural networks: A statistical physics perspective

Are Biological Systems Poised at Criticality?

This work reviews the surprising successes of this “inverse” approach, which builds statistical mechanics models of biological systems directly from real data, with examples from families of proteins, networks of neurons, and flocks of birds.

Statistical mechanics of letters in words.

  • G. Stephens, W. Bialek
  • Physics
    Physical review. E, Statistical, nonlinear, and soft matter physics
  • 2010
This work considers words as a network of interacting letters and approximates the probability distribution of states taken on by this network, suggesting that these states provide an effective vocabulary that is matched to the frequency of word use and much smaller than the full lexicon.