Density-based clustering: A ‘landscape view’ of multi-channel neural data for inference and dynamic complexity analysis

Gabriel Baglietto, Guido Gigante, and Paolo Del Giudice. PLoS ONE.
Simultaneous recordings from N electrodes generate N-dimensional time series that call for efficient representations to expose relevant aspects of the underlying dynamics. Binning the time series defines a sequence of neural activity vectors that populate the N-dimensional space as a density distribution; this distribution is especially informative when the neural dynamics proceeds as a noisy path through metastable states (often the case of interest in neuroscience), which makes clustering in the N-dimensional space…
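A minimal sketch of this idea, with synthetic activity vectors drawn around two invented metastable states and a simplified density-peaks clustering step (cluster assignment by nearest centre, rather than the full density-ordered propagation of the original method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: activity vectors for N = 5 channels hopping
# between two metastable firing-rate states every 100 time bins.
N, T = 5, 400
states = np.array([[2.0, 8.0, 1.0, 6.0, 3.0],
                   [7.0, 1.0, 5.0, 2.0, 9.0]])
labels_true = (np.arange(T) // 100) % 2
X = states[labels_true] + rng.normal(0.0, 0.5, size=(T, N))

# Density-peaks clustering in the N-dimensional space (minimal sketch).
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
d_c = np.percentile(D[D > 0], 2.0)                           # cutoff distance (heuristic)
rho = (D < d_c).sum(axis=1) - 1                              # local density of each point

# delta: distance to the nearest point of higher density (index breaks ties).
order = np.arange(T)
delta = np.empty(T)
for i in range(T):
    higher = (rho > rho[i]) | ((rho == rho[i]) & (order < i))
    delta[i] = D[i, higher].min() if higher.any() else D[i].max()

# Cluster centres combine high density with high delta; remaining points
# are assigned to the nearest centre (a simplification for this sketch).
centers = np.argsort(rho * delta)[-2:]
assign = centers[np.argmin(D[:, centers], axis=1)]
```

With well-separated states, each block of bins ends up in its own cluster, recovering the metastable structure directly from the density landscape.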


Optimal region of latching activity in an adaptive Potts model for networks of neurons
Noise, alongside adaptation, is suggested as a fundamental factor in the alternations among attractor states in models that combine the stochastic dynamics of Potts networks with an adaptive potential function.
Ising model for neural data: model quality and approximate methods for extracting functional connectivity.
Pairwise Ising models for describing the statistics of multineuron spike trains are studied using data from a simulated cortical network; the quality of these models is assessed by comparing their entropies with that of the data, showing that they perform well for small subsets of the neurons in the network but that the fit quality deteriorates as the subset size grows.
Inferring Synaptic Structure in Presence of Neural Interaction Time Scales
A new two-step method is introduced that first infers the delay structure of the network through cross-correlation profiles and then reconstructs the synaptic matrix; the relationship between the inferred couplings and the real synaptic efficacies, albeit quadratic in both cases, is found to depend critically on the time bin dt for the excitatory synapses only, while being essentially independent of it for the inhibitory ones.
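The first, delay-inference step of such a pipeline can be sketched as follows; the spike trains, rates, and delay below are invented for illustration, and the synaptic-matrix reconstruction step is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical binned spike trains: neuron B echoes neuron A three bins later.
T, true_delay = 5000, 3
a = (rng.random(T) < 0.05).astype(float)            # presynaptic train, p(spike) = 0.05
b = np.roll(a, true_delay)                          # delayed echo of A
b = np.clip(b + (rng.random(T) < 0.02), 0.0, 1.0)   # plus background spikes

# Cross-correlation profile over candidate lags; its peak estimates the delay.
lags = np.arange(-10, 11)
xcorr = np.array([np.dot(a, np.roll(b, -lag)) for lag in lags])
inferred_delay = int(lags[np.argmax(xcorr)])
```

The peak of the cross-correlogram stands far above the chance-coincidence floor, so the delay is recovered reliably even with background spikes mixed in.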
Cortical activity flips among quasi-stationary states.
The results indicated the existence of well-separated states of activity, within which the firing rates were approximately stationary, supporting the hypothesis that these distinct states were brought about by the cooperative action of many neurons.
Ising Models for Inferring Network Structure From Spike Data
This chapter derives algorithms for finding the connection strengths that best fit a given data set, as well as faster approximate algorithms based on mean-field theory, both tested on data from model networks and from experiments.
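A minimal sketch of the mean-field route, assuming a toy 4-spin network with made-up couplings, Metropolis sampling, and the naive mean-field inversion J ≈ -(C^-1) with C the spin covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy network: 4 Ising spins, one positive and one negative coupling.
N = 4
J_true = np.zeros((N, N))
J_true[0, 1] = J_true[1, 0] = 0.8
J_true[2, 3] = J_true[3, 2] = -0.8
h = np.zeros(N)

# Metropolis sampling of spin configurations s in {-1, +1}^N.
s = rng.choice([-1, 1], size=N)
samples = []
for t in range(60000):
    i = rng.integers(N)
    dE = 2 * s[i] * (J_true[i] @ s + h[i])     # energy change for flipping s[i]
    if dE < 0 or rng.random() < np.exp(-dE):
        s[i] = -s[i]
    if t > 10000 and t % 10 == 0:              # burn-in, then thinned samples
        samples.append(s.copy())
S = np.array(samples)

# Naive mean-field inversion: couplings from the inverse covariance matrix.
C = np.cov(S.T)
J_mf = -np.linalg.inv(C)
np.fill_diagonal(J_mf, 0.0)
```

The inferred matrix recovers the sign structure of `J_true` (positive 0–1 coupling, negative 2–3 coupling, near-zero elsewhere), though naive mean field is known to distort the magnitudes.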
Dynamics of Multistable States during Ongoing and Evoked Cortical Activity
A recurrent spiking network model is presented that accounts for both the spontaneous generation of state sequences and the multistability in single-neuron firing rates, and provides a theoretical framework that captures both ongoing and evoked network dynamics in a single mechanistic model.
Sequential Memory: A Putative Neural and Synaptic Dynamical Mechanism
It is shown that short-term memory for a sequence of items can be implemented in an autoassociation neural network that uses adaptation, rather than associative synaptic modification, to recall the order of the items in a recently presented sequence.
Memory recall and spike-frequency adaptation.
It is shown that spike-frequency adaptation (SFA), a common mechanism affecting neuron activation in the brain, can provide state-dependent control of pattern retrieval and gives a plausible account of different sorts of memory retrieval.
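A toy illustration of this mechanism, with all sizes, gains, and time constants invented: a Hopfield-style rate network stores two random patterns, and a slow adaptation current destabilises whichever pattern is currently retrieved; in this symmetric toy version the activity alternates between a pattern and its mirror image:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical network: 50 rate units storing two random binary patterns.
N = 50
xi = rng.choice([-1.0, 1.0], size=(2, N))   # stored patterns
W = (xi.T @ xi) / N                         # Hebbian couplings
np.fill_diagonal(W, 0.0)

s = xi[0] + 0.1 * rng.normal(size=N)        # start near pattern 0
a = np.zeros(N)                             # adaptation variable per unit
overlaps = []
for t in range(3000):
    s = np.tanh(4.0 * (W @ s - a))          # fast rate dynamics minus adaptation current
    a += 0.01 * (s - a)                     # slow adaptation tracks activity
    overlaps.append((xi @ s) / N)           # overlap with each stored pattern
overlaps = np.array(overlaps)
```

As adaptation builds up in the active units, the retrieved state loses stability and the overlap with pattern 0 swings periodically between roughly +1 and -1: a relaxation oscillation driven purely by the adaptation current, with no synaptic modification.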
Efficient Event-Driven Simulation of Large Networks of Spiking Neurons and Dynamical Synapses
The main impact of the new approach is a drastic reduction of the computational load incurred upon introduction of dynamic synaptic efficacies, which vary organically as a function of the activities of the pre- and postsynaptic neurons.
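The essence of an event-driven scheme can be sketched with a priority queue of spike-delivery events and closed-form membrane decay between events; the toy ring network, weights, and constants below are invented, and dynamic synapses are omitted:

```python
import heapq
import numpy as np

# Hypothetical ring of 3 leaky integrate-and-fire neurons, simulated event by
# event: a neuron's state is updated only when a spike arrives, using the
# exact exponential decay of the membrane potential since its last update.
N, tau, threshold, delay, w = 3, 20.0, 1.0, 1.0, 1.2
v = np.zeros(N)                      # membrane potentials
last = np.zeros(N)                   # time of each neuron's last update
events = [(0.0, 0)]                  # (delivery time, target): seed input to neuron 0
spikes = []

while events and len(spikes) < 20:
    t, i = heapq.heappop(events)                      # next event in time order
    v[i] = v[i] * np.exp(-(t - last[i]) / tau) + w    # decay, then synaptic kick
    last[i] = t
    if v[i] >= threshold:
        v[i] = 0.0                                    # fire and reset
        spikes.append((t, i))
        heapq.heappush(events, (t + delay, (i + 1) % N))   # spike to next neuron in ring
```

No global time step is needed: the simulation cost scales with the number of events rather than with simulated time, which is what makes adding activity-dependent synaptic variables affordable in this framework.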