Thermodynamic limits to information harvesting by sensory systems

  Stefano Bo, Marco Del Giudice, Antonio Celani
  Journal of Statistical Mechanics: Theory and Experiment

In view of the relation between information and thermodynamics, we investigate how much information about an external protocol can be stored in the memory of a stochastic measurement device, given an energy budget. We consider a layered device with a memory component that stores information about the external environment by monitoring the history of a sensory part coupled to the environment. We derive an integral fluctuation theorem for the entropy production and a measure of the information…
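The integral fluctuation theorem mentioned in the abstract has, in its standard stochastic-thermodynamics form (our notation, not necessarily the paper's exact statement, which also involves the stored information), the shape:

```latex
\bigl\langle e^{-\Delta s_{\mathrm{tot}}/k_B} \bigr\rangle = 1
\quad\Longrightarrow\quad
\langle \Delta s_{\mathrm{tot}} \rangle \ge 0 ,
```

where the implication follows from Jensen's inequality applied to the convex exponential, recovering the second law on average.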

Stochastic thermodynamics of information processing: bipartite systems with feedback, signal inference and information storage

A framework for two continuously coupled systems is developed that incorporates information and refines the standard second law of thermodynamics for bipartite systems, and a purely information-theoretic quantity, called the sensory capacity, is introduced.

Energy and information flows in autonomous systems

Multi-component molecular machines are ubiquitous in biology. We review recent progress on describing their thermodynamic properties using autonomous bipartite Markovian dynamics. …

Trade-Offs in Delayed Information Transmission in Biochemical Networks

It is found that feedback allows the circuit to overcome energy constraints and transmit close to the maximum available information even in the dissipationless limit, and the same universal motif is optimal in all of these conditions.

Fitness Gain of Individually Sensed Information by Cells

It is shown that an individually sensed signal always has a better fitness value, on average, than its mutual or directed information, and the optimizing fitness gain of individual sensing is shown to be related to fidelity allocations for individual environmental histories.

Information theory in biochemical regulatory networks: a theoretical study

This Thesis considers the optimization of information transmission as a viable design principle for biochemical networks, and applies this principle to a simple model regulatory circuit, finding that negative feedback loops are optimal at high dissipation, whereas positive feedback loops become more informative close to equilibrium conditions.

Prediction and Dissipation in Nonequilibrium Molecular Sensors: Conditionally Markovian Channels Driven by Memoryful Environments

It is found that the seemingly impoverished Hill molecule can capture an order of magnitude more predictable information than large random channels. The Hill molecule, characterized by the number of ligands that bind simultaneously (the sensor's cooperativity), is the simplest nontrivial biological sensor model.

Prediction and Power in Molecular Sensors: Uncertainty and Dissipation When Conditionally Markovian Channels Are Driven by Semi-Markov Environments

This work develops expressions for the predictive accuracy and thermodynamic costs of the broad class of conditionally Markovian sensors subject to unifilar hidden semi-Markov (memoryful) environmental inputs, and studies the simplest nontrivial biological sensor model: that of a Hill molecule.

Measurement-feedback formalism meets information reservoirs

A second-law-like inequality is derived by applying the measurement-feedback formalism to information reservoirs, which provides a stronger bound of extractable work than any other known inequality in the same setup.

Individual Sensing can Gain more Fitness than its Information

It is shown that an individually sensed signal always has a better fitness value, on average, than its mutual or directed information, and the optimizing fitness gain from individual sensing is shown to be related to fidelity allocations for individual environmental histories.

Nonequilibrium Thermodynamics of Chemical Reaction Networks: Wisdom from Stochastic Thermodynamics

We build a rigorous nonequilibrium thermodynamic description for open chemical reaction networks of elementary reactions. Their dynamics is described by deterministic rate equations satisfying mass-action kinetics.

Efficiency of cellular information processing

We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production.
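In symbols, the bound summarized above can be sketched as follows (the notation is ours, not necessarily the paper's):

```latex
l \;\equiv\; -\frac{\mathrm{d}}{\mathrm{d}t}\, H[X_t \mid Y_t]
\;\le\; \frac{\dot{S}_{\mathrm{tot}}}{k_B} ,
```

where $X_t$ is the external process, $Y_t$ the internal (sensing) process, and $\dot{S}_{\mathrm{tot}}$ the total entropy production rate; a positive learning rate $l$ therefore requires dissipation.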

The thermodynamics of prediction

Any system constructed to keep memory about its environment while operating with maximal energetic efficiency has to be predictive, and a fundamental equivalence between model inefficiency and thermodynamic inefficiency, measured by dissipation, is exposed.

Information-theoretic vs. thermodynamic entropy production in autonomous sensory networks

  • A. C. Barato, U. Seifert
  • Physics
    Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
  • 2013
An upper bound on the rate of mutual information is obtained analytically, and this rate is calculated with a numerical method that estimates the entropy of a time series generated by simulation.
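As a toy illustration of the kind of entropy-rate quantity such numerical time-series estimates target (a minimal sketch of our own, not the paper's method), the entropy rate of a two-state Markov chain can be computed in closed form:

```python
import math

def markov_entropy_rate(p01, p10):
    """Analytic entropy rate (bits/step) of a two-state Markov chain
    with transition probabilities p01 = P(1|0) and p10 = P(0|1)."""
    # stationary distribution of the chain
    pi0 = p10 / (p01 + p10)
    pi1 = p01 / (p01 + p10)

    def h(p):
        # binary entropy in bits, with the 0 log 0 = 0 convention
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    # entropy rate = stationary-weighted conditional entropy of the next step
    return pi0 * h(p01) + pi1 * h(p10)

print(markov_entropy_rate(0.5, 0.5))  # unbiased coin flips: 1 bit/step
print(markov_entropy_rate(0.1, 0.1))  # sticky chain: well below 1 bit/step
```

A plug-in estimate from a simulated trajectory should converge to this analytic value, which is the consistency check such numerical entropy estimators rely on.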

Experimental verification of Landauer’s principle linking information and thermodynamics

It is established that the mean dissipated heat saturates at the Landauer bound in the limit of long erasure cycles, demonstrating the intimate link between information theory and thermodynamics and highlighting the ultimate physical limit of irreversible computation.
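For reference, the Landauer bound itself is $k_B T \ln 2$ of heat per erased bit; a quick numeric check using the standard constants (our illustration, not taken from the experiment):

```python
import math

# Landauer bound: minimum mean heat dissipated to erase one bit at temperature T
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI definition)
T = 300.0           # room temperature, K

q_min = k_B * T * math.log(2)  # joules per erased bit, ~2.87e-21 J
print(f"Landauer bound at {T:.0f} K: {q_min:.3e} J per bit")
```

At room temperature this is a few zeptojoules, which is why observing the saturation experimentally requires slow, long erasure cycles on a well-controlled system.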

Unifying three perspectives on information processing in stochastic thermodynamics.

Stochastic thermodynamics is generalized to the presence of an information reservoir, and it is shown that both the entropy production involving mutual information between system and controller, and the one involving the Shannon entropy difference of an information reservoir such as a tape, carry an extra term different from the usual current times affinity.

Thermodynamic cost of measurements.

  • L. Granger, H. Kantz
  • Physics
    Physical review. E, Statistical, nonlinear, and soft matter physics
  • 2011
A minimal framework for the modeling of a measurement device and a protocol for the measurement of thermal fluctuations is presented and illustrated on a simple two-state system inspired by Szilard's information engine.

Energetic costs of cellular computation

It is shown that learning about external concentrations necessitates the breaking of detailed balance and consumption of energy, with greater learning requiring more energy, which suggests that the energetic costs of cellular computation may be an important constraint on networks designed to function in resource poor environments.

Stochastic thermodynamics, fluctuation theorems and molecular machines

  • U. Seifert
  • Physics
    Reports on Progress in Physics
  • 2012
Efficiency, and in particular efficiency at maximum power, can be discussed systematically beyond the linear-response regime for two classes of molecular machines, isothermal ones such as molecular motors and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.

The energy-speed-accuracy tradeoff in sensory adaptation

This work identifies key requirements for the underlying biochemical network to achieve accurate adaptation with a given energy budget and provides a general framework to study cost-performance tradeoffs for cellular regulatory functions and information processing.

Nonequilibrium thermodynamics and nonlinear kinetics in a cellular signaling switch.

The result demonstrates the importance of nonequilibrium thermodynamics in analyzing biological information processing, provides its energetic cost, establishes an interplay between signal transduction and energy metabolism in cells, and suggests a biological function for phosphoenergetics in the ubiquitous phosphorylation signaling.