For a general sensory system following an external stochastic signal, we introduce the sensory capacity. This quantity characterizes the performance of a sensor: sensory capacity is maximal if the instantaneous state of the sensor has as much information about a signal as the whole time series of the sensor. We show that adding a memory to the sensor …
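In assumed notation (the paper's own symbols are not reproduced in this excerpt), the sensory capacity can be sketched as the ratio of the learning rate ℓ_y, i.e. the rate at which the instantaneous sensor state gains information about the signal, to the information rate carried by the full sensor trajectory, so that an optimal sensor saturates

    C \equiv \frac{\ell_y}{\dot{I}^{\mathrm{traj}}} \leq 1 .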
Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions, can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is …
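The truncated statement is consistent with a bound of thermodynamic-uncertainty type: for an integrated current X observed over time t in a steady state with entropy production rate σ (in units of k_B), Var(X)/⟨X⟩² ≥ 2/(σ t). A minimal numerical sketch checking this for a biased random walk (the rates and parameters are illustrative assumptions, not the paper's own code):

    import numpy as np

    # Illustrative check of Var(X)/<X>^2 >= 2/(sigma*t) for a biased random walk
    # with forward rate kp and backward rate km (entropy production in units of k_B).
    rng = np.random.default_rng(0)
    kp, km, t_max, n_traj = 2.0, 0.5, 200.0, 2000

    X = np.zeros(n_traj)                              # net number of forward steps
    for i in range(n_traj):
        t, x = 0.0, 0
        while True:
            t += rng.exponential(1.0 / (kp + km))     # waiting time until next jump
            if t > t_max:
                break
            x += 1 if rng.random() < kp / (kp + km) else -1
        X[i] = x

    uncertainty = X.var() / X.mean() ** 2
    sigma = (kp - km) * np.log(kp / km)               # entropy production rate / k_B
    print(f"Var/<X>^2 = {uncertainty:.4f} >= 2/(sigma t) = {2.0 / (sigma * t_max):.4f}")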
The Fano factor, an observable quantifying fluctuations of product generation by a single enzyme, can reveal information about the underlying reaction scheme. A lower bound on this Fano factor that depends on the thermodynamic affinity driving the transformation from substrate to product constrains the number of intermediate states of an enzymatic cycle. So …
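For a unicyclic enzymatic scheme with N intermediate states and cycle affinity A (in units of k_BT), a bound of the kind described here takes the form (quoted from memory, so the exact expression should be treated as an assumption rather than the paper's statement)

    F \geq \frac{1}{N}\,\coth\!\left(\frac{A}{2N}\right),

with equality for uniform rates; inverting it turns a measured Fano factor and affinity into a lower bound on the number of states N.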
We generalize stochastic thermodynamics to include information reservoirs. Such information reservoirs, which can be modeled as a sequence of bits, modify the second law. For example, work extraction from a system in contact with a single heat bath becomes possible if the system also interacts with an information reservoir. We obtain an inequality, and the …
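In its simplest form (notation assumed here, not taken from the truncated text), such a modified second law states that work extracted from a single heat bath at temperature T is paid for by randomizing the bit sequence,

    W_{\mathrm{ext}} \leq k_B T\, \Delta H,

where ΔH is the change in Shannon entropy of the information reservoir; positive extracted work therefore requires ΔH > 0.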
For current fluctuations in nonequilibrium steady states of Markovian processes, we derive four different universal bounds valid beyond the Gaussian regime. Different variants of these bounds apply to either the entropy change or any individual current, e.g., the rate of substrate consumption in a chemical reaction or the electron current in an electronic …
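One representative member of such a family of bounds, written here in assumed notation, is a parabolic bound on the large-deviation rate function I(j) of a current with steady-state value j^s in terms of the entropy production rate σ,

    I(j) \leq \frac{\sigma}{4}\left(\frac{j}{j^{\mathrm{s}}}-1\right)^{2},

which holds beyond the Gaussian regime; expanding around j = j^s recovers a quadratic, thermodynamic-uncertainty-type bound on the variance.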
So far, feedback-driven systems have been discussed using (i) measurement and control, (ii) a tape interacting with a system, or (iii) by identifying an implicit Maxwell demon in steady-state transport. We derive the corresponding second laws from one master fluctuation theorem and discuss their relationship. In particular, we show that both the entropy …
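As a generic illustration of how a second law follows from an integral fluctuation theorem (the specific master fluctuation theorem is not reproduced in this excerpt): if a total entropy-production-like quantity Σ satisfies

    \langle e^{-\Sigma} \rangle = 1,

then convexity of the exponential (Jensen's inequality) immediately yields ⟨Σ⟩ ≥ 0, i.e. a second-law-like inequality for each admissible choice of Σ.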
For a paradigmatic model of chemotaxis, we analyze how a nonzero affinity that drives receptors out of equilibrium affects sensitivity. This affinity arises whenever changes in receptor activity involve ATP hydrolysis. The sensitivity integrated over a ligand concentration range is shown to be enhanced by the affinity, providing a measure of how much …
We derive expressions for the dispersion of two classes of random variables in Markov processes. Random variables such as currents and the activity belong to the first class, which is composed of random variables that change whenever a jump in the stochastic trajectory occurs. The second class corresponds to the time the trajectory spends in a state (or …
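A minimal simulation sketch (the three-state network and its rates are illustrative assumptions, not from the paper) estimating the mean and variance over a fixed observation window for one observable of each class, namely the total number of jumps and the residence time in one state:

    import numpy as np

    # Illustrative sketch: dispersion of (i) a jump-type observable, the total number
    # of transitions ("activity"), and (ii) a residence-time observable, the time
    # spent in state 0, for a three-state Markov jump process observed up to time T.
    rng = np.random.default_rng(1)
    rates = np.array([[0.0, 1.0, 0.5],
                      [0.7, 0.0, 1.2],
                      [0.3, 0.9, 0.0]])    # rates[i, j]: jump rate from i to j
    T, n_traj = 100.0, 5000

    activity = np.zeros(n_traj)
    residence0 = np.zeros(n_traj)
    for n in range(n_traj):
        state, t = 0, 0.0
        while True:
            out = rates[state]
            dt = rng.exponential(1.0 / out.sum())     # waiting time in current state
            if t + dt > T:
                residence0[n] += (T - t) if state == 0 else 0.0
                break
            if state == 0:
                residence0[n] += dt
            t += dt
            state = rng.choice(3, p=out / out.sum())  # pick the next state
            activity[n] += 1

    for name, x in [("activity", activity), ("residence in state 0", residence0)]:
        print(f"{name}: mean = {x.mean():.2f}, variance = {x.var():.2f}")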
For sensory networks, we determine the rate at which they acquire information about the changing external conditions. Comparing this rate with the thermodynamic entropy production that quantifies the cost of maintaining the network, we find that there is no universal bound restricting the rate of obtaining information to be less than this thermodynamic …