Weak universality in sensory tradeoffs.

@article{Marzen2016WeakUI,
  title={Weak universality in sensory tradeoffs},
  author={Sarah E. Marzen and Simon DeDeo},
  journal={Physical Review E},
  year={2016},
  volume={94},
  number={6},
  pages={060101}
}
For many organisms, the number of sensory neurons is largely determined during development, before strong environmental cues are present. This is despite the fact that environments can fluctuate drastically, both from generation to generation and within an organism's lifetime. How can organisms get by with a hard-coded number of sensory neurons? We approach this question using rate-distortion theory. A combination of simulation and theory suggests that when environments are large, the rate…
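The rate-distortion framing in the abstract can be made concrete with a short Blahut-Arimoto sketch. This is an illustrative implementation, not the authors' code: the function name, the uniform source, and the randomly drawn distortion matrix are assumptions chosen only to loosely mirror the paper's setup of fixed, randomly drawn confusion penalties.

```python
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=200):
    """Blahut-Arimoto iteration for one point on the rate-distortion curve.

    p_x  : source distribution over environmental states, shape (n,)
    d    : distortion matrix d[x, xhat] (penalty for confusing x with xhat)
    beta : trade-off parameter (beta = 0 ignores distortion entirely;
           large beta demands high fidelity)
    Returns (rate in bits, expected distortion).
    """
    n, m = d.shape
    q = np.full(m, 1.0 / m)                        # marginal over codewords
    for _ in range(n_iter):
        # Channel update: p(xhat | x) proportional to q(xhat) exp(-beta d).
        log_c = np.log(q)[None, :] - beta * d
        log_c -= log_c.max(axis=1, keepdims=True)  # stabilize the exponentials
        c = np.exp(log_c)
        c /= c.sum(axis=1, keepdims=True)
        # Marginal update, floored to avoid log(0) on the next pass.
        q = np.clip(p_x @ c, 1e-300, None)
    # The rate is the mutual information of the optimized channel; guard 0 log 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(c > 0, np.log2(c / q[None, :]), 0.0)
    rate = float(np.sum(p_x[:, None] * c * ratio))
    distortion = float(np.sum(p_x[:, None] * c * d))
    return rate, distortion
```

Sweeping beta traces out the rate-distortion curve: beta = 0 yields zero rate (the sensor ignores its input), while larger beta buys lower expected distortion at a correspondingly higher informational cost.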

The evolution of lossy compression

This work uses a tool from information theory, rate-distortion theory, to study large, unstructured environments with fixed, randomly drawn penalties for stimulus confusion ('distortions'), and identifies two distinct regimes for organisms in these environments: a high-fidelity regime, where perceptual costs grow linearly with environmental complexity, and a low-fidelity regime, where perceptual costs are, remarkably, independent of the number of environmental states.

Information theory, predictability and the emergence of complex life

A minimal formal model grounded in information theory and selection is presented, in which successive generations of agents are mapped onto transmitters and receivers of a coded message; it is conjectured that the potential for predicting the environment can overcome the expenses associated with maintaining costly, complex structures.

Prediction and Dissipation in Nonequilibrium Molecular Sensors: Conditionally Markovian Channels Driven by Memoryful Environments

This work studies the simplest nontrivial biological sensor model, that of a Hill molecule, characterized by the number of ligands that bind simultaneously (the sensor's cooperativity), and finds that the seemingly impoverished Hill molecule can capture an order of magnitude more predictable information than large random channels.

Prediction and Power in Molecular Sensors: Uncertainty and Dissipation When Conditionally Markovian Channels Are Driven by Semi-Markov Environments

This work develops expressions for the predictive accuracy and thermodynamic costs of the broad class of conditionally Markovian sensors subject to unifilar hidden semi-Markov (memoryful) environmental inputs, and studies the simplest nontrivial biological sensor model, that of a Hill molecule.

From typical sequences to typical genotypes.

How does an organism extract relevant information from transcription factor concentrations?

Early fly development is introduced as an exemplary system where information-theoretic approaches have traditionally been applied; one such method, the information bottleneck approach, has recently been used to infer structural features of enhancer architecture.

Minimal informational requirements for fitness.

This work develops a theoretical framework for understanding evolutionary "design principles" underlying information-cost trade-offs, introduces a correspondence between fitness and distortion, and solves for the rate-distortion functions of several systems using analytical and numerical methods.

Fate of Duplicated Neural Structures

This work focuses on the fate of duplicated neural circuits, and derives phase diagrams and (phase-like) transitions between single and duplicated circuits, which constrain evolutionary paths to complex cognition.

Infinitely large, randomly wired sensors cannot predict their input unless they are close to deterministic

It is shown that infinitely large, randomly wired sensors are nonspecific for their input, and therefore nonpredictive of future input, unless they are close to deterministic.

References

SHOWING 1-10 OF 39 REFERENCES

The evolution of lossy compression

This work uses a tool from information theory, rate-distortion theory, to study large, unstructured environments with fixed, randomly drawn penalties for stimulus confusion ('distortions'), and identifies two distinct regimes for organisms in these environments: a high-fidelity regime, where perceptual costs grow linearly with environmental complexity, and a low-fidelity regime, where perceptual costs are, remarkably, independent of the number of environmental states.

Natural image statistics and neural representation.

It has long been assumed that sensory neurons are adapted to the statistical properties of the signals to which they are exposed, but recent developments in statistical modeling have enabled researchers to study more sophisticated statistical models for visual images, to validate these models empirically against large sets of data, and to begin experimentally testing the efficient coding hypothesis.

Optimal Prediction in the Retina and Natural Motion Statistics

This paper summarizes recent results on predictive coding and optimal predictive information in the retina and suggests approaches for quantifying prediction in response to natural motion.

The metabolic cost of neural information

Biophysical measurements from cells in the blowfly retina yield estimates of the energy required to generate graded (analog) electrical signals that transmit known amounts of information; this high cost promotes the distribution of information among multiple pathways.

Energy Efficient Neural Codes

It is shown that for both binary and analog neurons, increased energy expenditure per neuron implies a decrease in average firing rate if energy efficient information transmission is to be maintained.

Neural coding of natural stimuli: information at sub-millisecond resolution

This work addresses this issue using the motion-sensitive neurons of the fly visual system as a test case, finding that significant amounts of visual information are represented by details of the spike train at millisecond and sub-millisecond precision, even though the sensory input has a correlation time of ~60 ms.

Maps in the brain: what can we learn from them?

It is argued that cortical maps reflect neuronal connectivity in intracortical circuits, and may be viewed as solutions that minimize wiring cost for given intracortical connectivity.

Metabolically Efficient Information Processing

The Arimoto-Blahut algorithm, generalized for cost constraints, can be used to derive and interpret the distribution of symbols for optimal energy-efficient coding in the presence of noise, and the possibilities and problems are outlined.

Molecular interactions underlying the specification of sensory neurons