Blindfold learning of an accurate neural metric

@article{Gardella2018BlindfoldLO,
  title={Blindfold learning of an accurate neural metric},
  author={Christophe Gardella and Olivier Marre and Thierry Mora},
  journal={Proceedings of the National Academy of Sciences},
  year={2018},
  volume={115},
  pages={3267--3272}
}
  • C. Gardella, O. Marre, T. Mora
  • Published 13 October 2017
  • Biology, Computer Science, Psychology
  • Proceedings of the National Academy of Sciences
Significance

To understand how neural signals code sensory stimuli, most approaches require knowing both the true stimulus and the neural response. The brain, however, only has access to the neural signals put out by sensory organs. How can it learn to relate neural responses to sensory stimuli, especially for responses to which it has never been exposed? Here we show how to solve this problem by building a metric on neural responses such that responses to the same stimulus are close. Although…
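The core idea in the Significance statement — learning a metric under which responses to the same stimulus are close, without ever decoding the stimulus itself — can be illustrated with a minimal contrastive sketch. Everything below is hypothetical (the toy neurons, firing probabilities, quadratic metric parameterization, margin, and learning rate are illustrative choices, not the paper's actual model or data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical, not the paper's retinal recordings): 20 binary
# neurons, two stimuli, each stimulus driving a different subset of cells.
n_neurons, n_trials = 20, 100
p = np.full((2, n_neurons), 0.1)          # baseline firing probability
p[0, :10] = 0.8                           # stimulus 0 excites neurons 0-9
p[1, 10:] = 0.8                           # stimulus 1 excites neurons 10-19
responses = (rng.random((2, n_trials, n_neurons)) < p[:, None, :]).astype(float)

def pair_diffs(a, b):
    """All cross-trial difference vectors between two sets of responses."""
    return (a[:, None, :] - b[None, :, :]).reshape(-1, a.shape[1])

same = np.vstack([pair_diffs(responses[s], responses[s]) for s in (0, 1)])
diff = pair_diffs(responses[0], responses[1])

# Learn d(r, r') = ||L (r - r')||^2, i.e. a quadratic metric M = L^T L
# (positive semi-definite by construction), with a contrastive objective:
# shrink same-stimulus distances, keep different-stimulus distances above
# a margin. Note that only same/different pair labels are used -- the
# stimulus identity itself never enters the learning rule.
L = np.eye(n_neurons)
margin, lr = 4.0, 4e-3
C_same = same.T @ same / len(same)        # second moment of same-stimulus pairs
for _ in range(500):
    grad = 2 * L @ C_same                 # pulls same-stimulus pairs together
    d_diff = ((diff @ L.T) ** 2).sum(axis=1)
    viol = diff[d_diff < margin]          # pairs violating the margin
    if len(viol):
        grad -= 2 * L @ (viol.T @ viol) / len(diff)   # pushes them apart
    L -= lr * grad

d_same_mean = ((same @ L.T) ** 2).sum(axis=1).mean()
d_diff_mean = ((diff @ L.T) ** 2).sum(axis=1).mean()
```

After training, same-stimulus pairs sit much closer under the learned metric than different-stimulus pairs. The paper's actual method is more sophisticated; this sketch only captures the "responses to the same stimulus are close" objective stated in the Significance paragraph.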


Learning a neural response metric for retinal prosthesis
TLDR
This work proposes a method to learn a metric on neural responses directly from recorded light responses of a population of retinal ganglion cells (RGCs) in the primate retina and demonstrates that the learned metric leads to improvements in the performance of a retinal prosthesis.
Modeling a population of retinal ganglion cells with restricted Boltzmann machines
TLDR
This work validates the applicability of restricted Boltzmann machines to model the spiking activity of a large population of RGCs recorded with high-resolution electrode arrays, and shows that latent variables can encode modes in the RGC activity distribution that are closely related to the visual stimuli.
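The technique named in this summary, fitting a restricted Boltzmann machine to binary population activity, can be sketched with one-step contrastive divergence (CD-1). The data, layer sizes, and learning rate below are illustrative toy choices, not those of the cited study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "population activity" (hypothetical, not recorded data): 16 binary
# neurons whose spike patterns come from two activity modes plus flip noise.
n_vis, n_hid, n_samples = 16, 4, 500
modes = np.array([[1] * 8 + [0] * 8, [0] * 8 + [1] * 8], dtype=float)
data = modes[rng.integers(0, 2, n_samples)]
data = np.where(rng.random(data.shape) < 0.1, 1 - data, data)  # 10% bit flips

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Binary RBM trained with one-step contrastive divergence (CD-1).
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)
lr = 0.1
for _ in range(300):
    v0 = data
    ph0 = sigmoid(v0 @ W + b_h)                       # hidden probabilities
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sampled hidden states
    pv1 = sigmoid(h0 @ W.T + b_v)                     # reconstruction probs
    v1 = (rng.random(pv1.shape) < pv1).astype(float)  # sampled reconstruction
    ph1 = sigmoid(v1 @ W + b_h)
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n_samples   # CD-1 weight update
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)

# Deterministic reconstruction error after training
recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
err = np.abs(recon - data).mean()
```

If training succeeds, the hidden units come to encode the two activity modes, so reconstructions track the data well below chance error, mirroring (in miniature) the claim that latent variables capture modes of the RGC activity distribution.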
Optimal Encoding in Stochastic Latent-Variable Models
TLDR
This work finds that networks with sufficient capacity learn to balance precision and noise-robustness in order to adaptively communicate stimuli with varying information content, and finds that statistical criticality in the neural population code emerges at model sizes where the input statistics are well captured.
Measures of neural similarity
TLDR
Across two published fMRI datasets, it was found that the preferred neural similarity measures were common across brain regions but differed across tasks, and that Pearson correlation was consistently surpassed by alternatives.
Statistical modelling of neuronal population activity: from data analysis to network function
TLDR
This work contributed to developing a fast and scalable spike-sorting algorithm based on action potential shapes and estimated spike locations, and studied the physical properties of RBMs fitted to neural activity, finding that they exhibit signatures of criticality, as observed before in similar models.
Modeling the Correlated Activity of Neural Populations: A Review
TLDR
This review covers a variety of models describing correlations between pairs of neurons as well as larger groups, synchronous or delayed in time, with or without the explicit influence of the stimulus, and with or without latent variables.
The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules
TLDR
Overall, it is shown how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture.
Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics
TLDR
This review shows how the thermodynamic formalism can be exploited in theoretical neuroscience, as a conceptual and operational tool, to link the dynamics of interacting neurons with the statistics of action potentials from either experimental data or mathematical models.
Restricted Boltzmann Machines as Models of Interacting Variables
TLDR
It is shown that the weak parameter approximation is a good approximation for different RBMs trained on the MNIST data set, and in these cases the mapping reveals that the inferred models are essentially low-order interaction models.
...

References

SHOWING 1-10 OF 85 REFERENCES
Ruling out and ruling in neural codes
TLDR
The results show that standard coarse coding (spike count coding) is insufficient; finer, more information-rich codes are necessary.
Deep Neural Networks Rival the Representation of Primate IT Cortex for Core Visual Object Recognition
TLDR
These evaluations show that, unlike previous bio-inspired models, the latest DNNs rival the representational performance of IT cortex on this visual object recognition task; the authors also propose an extension of “kernel analysis” that measures generalization accuracy as a function of representational complexity.
Closed-loop estimation of retinal network sensitivity reveals signature of efficient coding
TLDR
A method to characterize the sensitivity of the retinal network to perturbations of a stimulus is developed, and it is argued that a peak in the sensitivity is set to maximize information transmission.
Mapping a Complete Neural Population in the Retina
TLDR
This work shows that the combination of a large, dense multielectrode array and a novel, mostly automated spike-sorting algorithm allowed the authors to record simultaneously from a highly overlapping population of >200 ganglion cells in the salamander retina, giving unprecedented access to the complete neural representation of visual information.
Decoding visual information from a population of retinal ganglion cells.
TLDR
This work investigates how a time-dependent visual stimulus is encoded by the collective activity of many retinal ganglion cells, and shows that the optimal interpretation of a ganglion cell's action potential depends strongly on the simultaneous activity of other nearby cells.
Distinct time scales in cortical discrimination of natural sounds in songbirds.
TLDR
The existence of distinct time scales for temporal resolution and temporal integration is demonstrated and how they arise from cortical neural responses to complex dynamic sounds is explained.
Quality Time: Representation of a Multidimensional Sensory Domain through Temporal Coding
TLDR
It is found that for the more broadly tuned neurons in the NTS, the taste space is a systematic representation of the entire taste domain, and the way that taste quality is represented by the firing rate envelope is consistent across the population of cells.
Error-Robust Modes of the Retinal Population Code
TLDR
A novel statistical model is developed that decomposes the population response into modes and predicts the distribution of spiking activity in the ganglion cell population with high accuracy; it is found that the modes represent localized features of the visual stimulus that are distinct from the features represented by single neurons.
A thesaurus for a neural population code
TLDR
This work uses models of network encoding noise to learn a thesaurus for populations of neurons in the vertebrate retina responding to artificial and natural videos, measuring the similarity between population responses to visual stimuli based on the information they carry.
Modeling Retinal Ganglion Cell Population Activity with Restricted Boltzmann Machines
TLDR
Results show that binary states can encode the regularities associated with different stimuli, using both gratings and natural scenes as stimuli, and that hidden variables encode interesting properties of retinal activity, interpreted as population receptive fields.
...