The Ising decoder: reading out the activity of large neural ensembles

@article{Schaub2011TheID,
  title={The Ising decoder: reading out the activity of large neural ensembles},
  author={Michael T. Schaub and Simon R. Schultz},
  journal={Journal of Computational Neuroscience},
  year={2011},
  volume={32},
  pages={101--118}
}
  • Published 9 September 2010
The Ising model has recently received much attention for the statistical description of neural spike train data. In this paper, we propose and demonstrate its use for building decoders capable of predicting, on a millisecond timescale, the stimulus represented by a pattern of neural activity. After fitting to a training dataset, the Ising decoder can be applied “online” for instantaneous decoding of test data. While such models can be fit exactly using Boltzmann learning, this approach rapidly… 
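The abstract notes that pairwise Ising models can be fit exactly by Boltzmann learning and then applied for instantaneous decoding. As a minimal illustration (not the paper's implementation), the sketch below fits one pairwise Ising model per stimulus by Boltzmann learning with exact expectations — enumerating all 2^n states, so feasible only for small populations — and decodes held-out patterns by maximum likelihood. The synthetic data, firing rates, and all function names are assumptions for the demo.

```python
import numpy as np
from itertools import product

def all_states(n):
    # All 2^n binary patterns, shape (2^n, n)
    return np.array(list(product([0, 1], repeat=n)), dtype=float)

def fit_ising(X, n_iter=500, lr=0.5):
    """Boltzmann learning with exact model expectations (small n only)."""
    n = X.shape[1]
    S = all_states(n)
    mu_data = X.mean(0)              # empirical firing rates
    C_data = (X.T @ X) / len(X)      # empirical second moments
    h = np.zeros(n)
    J = np.zeros((n, n))
    for _ in range(n_iter):
        # Exact model expectations via enumeration of all states
        E = S @ h + 0.5 * np.einsum('ki,ij,kj->k', S, J, S)
        p = np.exp(E - E.max()); p /= p.sum()
        mu_model = p @ S
        C_model = S.T @ (S * p[:, None])
        # Gradient ascent on the log-likelihood (moment matching)
        h += lr * (mu_data - mu_model)
        dJ = lr * (C_data - C_model)
        np.fill_diagonal(dJ, 0.0)    # diagonal is absorbed into h for binary units
        J += dJ
    return h, J

def logp(h, J, X):
    # Exact log-probability of each pattern in X under the fitted model
    S = all_states(len(h))
    E_all = S @ h + 0.5 * np.einsum('ki,ij,kj->k', S, J, S)
    logZ = E_all.max() + np.log(np.exp(E_all - E_all.max()).sum())
    E = X @ h + 0.5 * np.einsum('ki,ij,kj->k', X, J, X)
    return E - logZ

# Synthetic demo: two "stimuli" with different population firing rates
rng = np.random.default_rng(0)
n, n_train, n_test = 4, 400, 200
rates = {0: 0.2, 1: 0.7}
train = {s: (rng.random((n_train, n)) < r).astype(float) for s, r in rates.items()}
test = {s: (rng.random((n_test, n)) < r).astype(float) for s, r in rates.items()}

models = {s: fit_ising(train[s]) for s in rates}

def decode(X):
    # Maximum-likelihood decoding: pick the stimulus whose model scores highest
    scores = np.stack([logp(*models[s], X) for s in sorted(models)])
    return scores.argmax(0)

acc = np.mean([(decode(test[s]) == s).mean() for s in rates])
print(f"decoding accuracy: {acc:.2f}")
```

With independent synthetic neurons the couplings J converge toward zero and decoding reduces to comparing rate models; on correlated data the pairwise terms carry additional information, which is the point of the Ising approach.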

Modeling Laminar Recordings from Visual Cortex with Semi-Restricted Boltzmann Machines

  • Computer Science
  • 2012
TLDR
This work proposes a more general maximum entropy model, the semi-restricted Boltzmann machine (sRBM), which extends the Ising model to capture higher order dependencies using hidden units, and highlights the importance of modeling higher order interactions across space and time to characterize activity in cortical networks.

Approximate Inference for Time-Varying Interactions and Macroscopic Dynamics of Neural Populations

TLDR
By introducing multiple analytic approximation methods to a state-space model of neural population activity, this work makes it possible to estimate dynamic pairwise interactions of up to 60 neurons and accurately estimates dynamics of network properties such as sparseness, entropy, and heat capacity by simulated data.

Ising models for neural activity inferred via selective cluster expansion: structural and coding properties

TLDR
The selective cluster expansion of the entropy, a method for inferring an Ising model which describes the correlated activity of populations of neurons, is described and differences in the inferred structure of retinal and cortical networks are found: inferred interactions tend to be more irregular and sparse for cortical data than for retinal data.

Entropy-based parametric estimation of spike train statistics

TLDR
This work considers the evolution of a network of neurons, focusing on the asymptotic behavior of spikes dynamics instead of membrane potential dynamics, and proposes a parametric statistical model where the probability has the form of a Gibbs distribution.

Neuronal Ensemble Decoding Using a Dynamical Maximum Entropy Model

TLDR
A novel decoder is developed that extends a maximum entropy decoder to take time-varying neural information into account and suggests that the proposed decoder may be able to infer neural information more effectively as it exploits dynamical properties of underlying neural networks.

The population tracking model: A simple, scalable statistical model for neural population data

TLDR
A new statistical method for characterizing neural population activity is introduced that requires semi-independent fitting of only as many parameters as the square of the number of neurons, thus requiring drastically smaller data sets and minimal computation time.

Missing mass approximations for the partition function of stimulus driven Ising models

TLDR
This work presents an extremely fast, yet simply implemented, method for approximating the stimulus-dependent partition function in minutes or seconds, and demonstrates that it requires orders of magnitude less computation time than Monte Carlo methods and can approximate the stimulus-driven partition function more accurately than either Monte Carlo methods or deterministic approximations.

Parametric models to relate spike train and LFP dynamics with neural information processing

TLDR
This work proposes a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing and extracts a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing.

On the Number of Neurons and Time Scale of Integration Underlying the Formation of Percepts in the Brain

TLDR
This work provides the first thorough interpretation of (feed-forward) percept formation from a population of sensory neurons, and discusses applications to experimental recordings in classic sensory decision-making tasks, which will hopefully provide new insights into the nature of perceptual integration.

References

SHOWING 1-10 OF 58 REFERENCES

Maximum entropy decoding of multivariate neural spike trains

TLDR
This work has examined the cross-validated performance of the algorithm by decoding patterns of activity in a 2D Ising Model in which stimulus dependent statistical structure has been imposed at different orders.

Ising model for neural data: model quality and approximate methods for extracting functional connectivity.

TLDR
Pairwise Ising models for describing the statistics of multineuron spike trains are studied using data from a simulated cortical network; the quality of these models is assessed by comparing their entropies with that of the data, finding that they perform well for small subsets of the neurons in the network, but that the fit quality starts to deteriorate as the subset size grows.

On Decoding the Responses of a Population of Neurons from Short Time Windows

The effectiveness of various stimulus identification (decoding) procedures for extracting the information carried by the responses of a population of neurons to a set of repeatedly presented stimuli is examined.

A Unified Approach to the Study of Temporal, Correlational, and Rate Coding

TLDR
It is demonstrated that the information contained in the spike occurrence times of a population of neurons can be broken up into a series of terms, each reflecting something about potential coding mechanisms, which forms the basis for a new quantitative procedure for analyzing simultaneous multiple neuron recordings and provides theoretical constraints on neural coding strategies.

Synergistic Coding by Cortical Neural Ensembles

TLDR
It is shown that conditional independence between neuronal outputs may not provide an optimal encoding strategy when the firing probability of a neuron depends on the history of firing of other neurons connected to it, and cooperation among neurons can provide a message-passing mechanism that preserves most of the information in the covariates under specific constraints governing their connectivity structure.

Inferring network connectivity using kinetic Ising models

TLDR
A kinetic Ising network with asymmetric connections, updated either asynchronously or synchronously, gives qualitatively good results, enabling us to identify the strongest connection reliably, though the magnitudes obtained tend to be off by a scaling factor that depends on noise level and mean firing rate.

The ‘Ideal Homunculus’: decoding neural population signals

The Asynchronous State in Cortical Circuits

TLDR
It is shown theoretically that recurrent neural networks can generate an asynchronous state characterized by arbitrarily low mean spiking correlations despite substantial amounts of shared input: the network generates negative correlations in synaptic currents which cancel the effect of shared input.

Temporal correlations and neural spike train entropy.

TLDR
A procedure for computing the full temporal entropy and information of ensembles of neural spike trains, which performs reliably for limited samples of data, and yields insight to the role of correlations between spikes in temporal coding mechanisms.

The ‘Ideal Homunculus’: Statistical Inference from Neural Population Responses

TLDR
It is suggested here that Bayesian statistical inference can help answer these questions by allowing us to ‘read the neural code’ not only in the time domain but also across a population of neurons.