Baseline control of optimal performance in recurrent neural networks
Shun Ogawa, Francesco Fumarola, Luca Mazzucato
Changes in behavioral state, such as arousal and movement, strongly affect neural activity in sensory areas. Recent evidence suggests that these effects may be mediated by top-down projections regulating the statistics of baseline input currents to sensory areas, inducing qualitatively different effects across sensory modalities. What are the computational benefits of these baseline modulations? We investigate this question within a brain-inspired framework for reservoir computing, where we vary the…
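The reservoir-computing setting mentioned above can be illustrated with a minimal sketch. The model below is hypothetical and not the paper's actual network: a random tanh reservoir whose units all receive a constant `baseline` current (standing in for a top-down baseline modulation), read out by ridge regression on a toy delay task.

```python
import numpy as np

def run_reservoir(u, baseline=0.0, n=200, g=1.2, seed=0):
    """Drive a random tanh rate reservoir with scalar input u.

    `baseline` shifts every unit's input current by a constant,
    a crude stand-in for top-down baseline modulation (illustrative
    parameterization, not the paper's model).
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, g / np.sqrt(n), (n, n))   # recurrent weights, gain g
    w_in = rng.normal(0.0, 1.0, n)                # input weights
    x = np.zeros(n)
    states = np.empty((len(u), n))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + w_in * ut + baseline)
        states[t] = x
    return states

def readout_mse(states, target, reg=1e-4):
    """Train a ridge-regression linear readout; return training MSE."""
    X = states
    w = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ target)
    return float(np.mean((X @ w - target) ** 2))

# Toy task: reconstruct the input from 3 steps earlier,
# comparing two baseline levels.
rng = np.random.default_rng(1)
u = rng.uniform(-1.0, 1.0, 500)
target = np.roll(u, 3)
for b in (0.0, 0.5):
    print(f"baseline={b}: mse={readout_mse(run_reservoir(u, baseline=b), target):.4f}")
```

Varying `baseline` moves the reservoir's operating point along the tanh nonlinearity, which changes its effective gain and memory; the abstract's question is which baseline statistics optimize such task performance.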


Input correlations impede suppression of chaos and learning in balanced rate networks
A non-stationary dynamic mean-field theory is developed that determines how the activity statistics and largest Lyapunov exponent depend on frequency and amplitude of the input, recurrent coupling strength, and network size, for both common and independent input.
Stimulus-dependent suppression of chaos in recurrent neural networks.
It is found that inputs not only drive network responses, but they also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated.
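The suppression effect described in these two summaries can be sketched numerically. Assuming a standard random rate network dx/dt = -x + J·tanh(x) + I(t) with common sinusoidal input, and a Benettin-style two-trajectory estimate of the largest Lyapunov exponent (all parameter values here are illustrative, not those of the papers):

```python
import numpy as np

def lyap_estimate(amp, n=100, g=2.0, steps=4000, dt=0.05, d0=1e-7, seed=0):
    """Estimate the largest Lyapunov exponent of
    dx/dt = -x + J tanh(x) + amp*sin(t) (input common to all units)
    by tracking two nearby trajectories with periodic renormalization.
    Euler integration; illustrative parameters only.
    """
    rng = np.random.default_rng(seed)
    J = g * rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))  # gain g > 1: chaotic when unforced
    x = rng.normal(0.0, 1.0, n)
    for t in range(1000):                               # discard transient
        x += dt * (-x + J @ np.tanh(x) + amp * np.sin(dt * t))
    pert = rng.normal(0.0, 1.0, n)
    y = x + pert * (d0 / np.linalg.norm(pert))          # nearby companion trajectory
    log_growth = 0.0
    for t in range(1000, 1000 + steps):
        drive = amp * np.sin(dt * t)
        x = x + dt * (-x + J @ np.tanh(x) + drive)
        y = y + dt * (-y + J @ np.tanh(y) + drive)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (y - x) * (d0 / d)                      # renormalize the separation
    return log_growth / (steps * dt)

print("no input:    ", lyap_estimate(0.0))
print("strong input:", lyap_estimate(5.0))
```

With no input the exponent is positive (chaos); a sufficiently strong common drive pulls it down, which is the transition these papers characterize analytically.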
State-Dependent Regulation of Cortical Processing Speed via Gain Modulation
A theory is introduced explaining how the speed at which sensory cortex processes incoming information is adjusted by changes in top-down projections that control the timescale of neural activity.
Beyond the edge of chaos: amplification and temporal integration by recurrent networks in the chaotic regime.
This study analytically evaluates how well a small external input can be reconstructed from a sparse linear readout of network activity, and shows that, near the edge, decoding performance is characterized by a critical exponent that takes a different value on the two sides.
Expectation-induced modulation of metastable activity underlies faster coding of sensory stimuli
A novel computational mechanism underlying the expectation-dependent acceleration of coding observed in the gustatory cortex of alert rats is reported, providing a biologically plausible theory of expectation and ascribing an alternative functional role to intrinsically generated, metastable activity.
Optimal Sequence Memory in Driven Random Networks
This work investigates the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing, finding an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form.
Randomly connected networks generate emergent selectivity and predict decoding properties of large populations of neurons
This work finds that a random network model matches many features of experimental recordings, from single cells to populations, and suggests that distributed stimulus selectivity and patterns of functional organization in population codes could be emergent properties of randomly connected networks.
Chaos in Neuronal Networks with Balanced Excitatory and Inhibitory Activity
The hypothesis that the temporal variability in the firing of a neuron results from an approximate balance between its excitatory and inhibitory inputs was investigated theoretically.
The dynamic brain: an exploration of neuronal variability and its functional significance
This book examines neuronal variability from theoretical, experimental and clinical perspectives and raises the very distinct possibility that noise may in fact contain real, meaningful information that is available to the nervous system for information processing.
Behavioral states, network states, and sensory response variability.
It is suggested that the CNS may have evolved specifically to deal with stimulus variability and that the coupling with network states may be central to sensory processing.