Minimal informational requirements for fitness.

Alexander S. Moffett and Andrew W. Eckford, "Minimal informational requirements for fitness," Physical Review E 105(1-1).
The existing concept of the "fitness value of information" provides a theoretical upper bound on the fitness advantage of using information concerning a fluctuating environment. Using concepts from rate-distortion theory, we develop a theoretical framework to answer a different pair of questions: What is the minimal amount of information needed for a population to achieve a certain growth rate? What is the minimal amount of information gain needed for one subpopulation to achieve a certain… 
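
The rate-distortion framing in the abstract can be illustrated with the standard Blahut-Arimoto iteration, which traces the minimal mutual information compatible with a given average distortion. This is a generic sketch, not the paper's actual model: the distortion matrix, trade-off parameter, and distributions below are illustrative assumptions.

```python
import numpy as np

def blahut_arimoto_rd(p_x, d, beta, n_iter=500):
    """Blahut-Arimoto iteration for a rate-distortion trade-off.

    p_x  : environment distribution, shape (nx,)
    d    : distortion matrix d[x, y] (illustrative cost of expressing
           phenotype y in environmental state x)
    beta : trade-off multiplier; larger beta demands lower distortion
    Returns (rate_in_bits, expected_distortion).
    """
    ny = d.shape[1]
    q_y = np.full(ny, 1.0 / ny)                  # phenotype marginal
    for _ in range(n_iter):
        w = q_y[None, :] * np.exp(-beta * d)     # unnormalized q(y|x)
        q_y_given_x = w / w.sum(axis=1, keepdims=True)
        q_y = p_x @ q_y_given_x                  # re-estimate the marginal
    joint = p_x[:, None] * q_y_given_x
    rate = np.sum(joint * np.log2(q_y_given_x / q_y[None, :]))
    dist = np.sum(joint * d)
    return rate, dist
```

Sweeping `beta` maps out the rate-distortion curve: for a two-state environment with a Hamming-style penalty, large `beta` drives the rate toward the full 1 bit of environmental entropy, while small `beta` yields near-zero information and correspondingly high distortion, mirroring the "minimal information for a given growth rate" question.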

Figures from this paper

The Value of Information for Populations in Varying Environments

This work presents a model of population dynamics in which the value of information is amenable to mathematical analysis, and shows that the bound on the value of information can be violated when accounting for features that are irrelevant in finance but inherent to biological systems, such as stochasticity at the individual level.

The fitness value of information

It is shown that in many cases the fitness value of a developmental cue, when measured this way, is exactly equal to the reduction in uncertainty about the environment, as described by the mutual information.

Fitness value of information with delayed phenotype switching: Optimal performance with imperfect sensing.

The ability of organisms to accurately sense their environment and respond accordingly is critical for evolutionary success. However, exactly how the sensory ability influences fitness is a topic of…

Phenotypic diversity as an adaptation to environmental uncertainty

A graphical heuristic for determining the optimal amount of diversity in a fluctuating environment is developed, and it is confirmed that bet-hedging should be observed only within a certain range of environmental variation.
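
The claim that bet-hedging pays only within a certain range of environmental variation can be checked numerically in a minimal two-state model. The fitness values and environment probabilities below are illustrative assumptions chosen to exhibit both regimes, not parameters from the paper.

```python
import numpy as np

def optimal_hedge(p_env, w_match, w_mismatch, grid=10001):
    """Grid-search the phenotype fraction maximizing long-run log growth.

    Two environments (probabilities p_env and 1 - p_env) and two phenotypes;
    a matched phenotype multiplies the lineage by w_match per generation,
    a mismatched one by w_mismatch. The population commits fraction f to
    phenotype A. Returns (best_fraction, best_log_growth_rate).
    """
    f = np.linspace(0.0, 1.0, grid)
    growth = (p_env * np.log(f * w_match + (1 - f) * w_mismatch)
              + (1 - p_env) * np.log(f * w_mismatch + (1 - f) * w_match))
    i = np.argmax(growth)
    return f[i], growth[i]
```

With severe mismatch penalties (e.g. `w_match=2.0`, `w_mismatch=0.1`) the optimum sits strictly inside (0, 1) — diversification pays — whereas with mild fluctuation (`w_mismatch=1.5`) the optimum collapses to a pure strategy, consistent with hedging being favored only within a window of environmental variation.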

The evolution of lossy compression

This work uses a tool from information theory, rate-distortion theory, to study large, unstructured environments with fixed, randomly drawn penalties for stimulus confusion ('distortions'), and identifies two distinct regimes for organisms in these environments: a high-fidelity regime where perceptual costs grow linearly with environmental complexity, and a low-fidelity regime where perceptual costs are, remarkably, independent of the number of environmental states.

Evolutionary Trade-Offs, Pareto Optimality, and the Geometry of Phenotype Space

It is found that best trade-off phenotypes are weighted averages of archetypes (phenotypes specialized for single tasks), which could explain linear trait correlations, allometric relationships, and bacterial gene-expression patterns.

Informations in Models of Evolutionary Dynamics

An approach to quantifying informations in mathematical models of evolutionary dynamics is reviewed, and explicit results are obtained for a solvable subclass of these models.

A new interpretation of information rate

The maximum exponential rate of growth of the gambler's capital is equal to the rate of transmission of information over the channel, generalized to include the case of arbitrary odds.
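
Kelly's identity can be verified in a toy setting: a fair binary outcome at even odds, with a side-information signal that matches the outcome with probability p (a binary symmetric channel — these are illustrative assumptions, not the general arbitrary-odds case). Proportional betting on the posterior achieves a doubling rate exactly equal to the channel's mutual information.

```python
import numpy as np

def kelly_growth_gain(p_correct):
    """Expected log2 growth rate from proportional (Kelly) betting.

    Even-odds bet on a fair binary outcome; the signal matches the
    outcome with probability p_correct. Betting fractions (p, 1-p) on
    the two outcomes, capital multiplies by 2p with prob p and by
    2(1-p) with prob 1-p.
    """
    p, q = p_correct, 1.0 - p_correct
    return p * np.log2(2 * p) + q * np.log2(2 * q)

def mutual_information_bsc(p_correct):
    """I(signal; outcome) for a binary symmetric channel with fair input."""
    p, q = p_correct, 1.0 - p_correct
    h2 = -(p * np.log2(p) + q * np.log2(q))
    return 1.0 - h2
```

A fully informative signal (p = 1) yields 1 bit per bet and the capital doubles each round; an uninformative one (p = 0.5) yields zero growth advantage, matching the statement that growth rate equals the rate of information transmission.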

Weak universality in sensory tradeoffs.

A combination of simulation and theory suggests that when environments are large, the rate-distortion function (a proxy for material costs, timing delays, and energy requirements) depends only on coarse-grained environmental statistics that are expected to change on evolutionary, rather than ontogenetic, time scales.

Fate of a mutation in a fluctuating environment

It is found that even when a mutation experiences many environmental epochs before fixing or going extinct, its fate is not necessarily determined by its time-averaged selective effect, rendering selection unable to distinguish between mutations that are substantially beneficial and substantially deleterious on average.