
We show that the rate of conditional Shannon entropy reduction, which characterizes what an internal process learns about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing…

- David Hartich, Andre C Barato, Udo Seifert
- Physical review. E
- 2016
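
The bound stated above can be summarized compactly. A hedged sketch with assumed symbols ($l$ for the learning rate, i.e. the rate of conditional Shannon entropy reduction, and $\dot{\sigma}$ for the thermodynamic entropy production rate; the paper's own notation may differ):

```latex
% Learning rate bounded by entropy production (notation assumed)
l \;\le\; \dot{\sigma},
\qquad
\eta \;\equiv\; \frac{l}{\dot{\sigma}} \;\le\; 1 ,
```

where $\eta$ is the informational efficiency mentioned in the abstract: it approaches 1 only when nearly all dissipation is converted into information about the external process.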

For a general sensory system following an external stochastic signal, we introduce the sensory capacity. This quantity characterizes the performance of a sensor: sensory capacity is maximal if the instantaneous state of the sensor has as much information about a signal as the whole time series of the sensor. We show that adding a memory to the sensor…

- Andre C Barato, Udo Seifert
- Physical review letters
- 2015
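
Read literally from the abstract, the sensory capacity compares two mutual informations. A minimal sketch in assumed notation ($x_t$ the signal, $y_t$ the instantaneous sensor state, $\{y_s\}_{s \le t}$ the full sensor history; the paper's exact symbols may differ):

```latex
% Sensory capacity as a ratio of informations (notation assumed)
C \;\equiv\; \frac{I\!\left(x_t ; y_t\right)}{I\!\left(x_t ; \{y_s\}_{s \le t}\right)}
\;\le\; 1 ,
```

with $C = 1$ when the instantaneous sensor state is as informative about the signal as the entire sensor time series.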

Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions, can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is…

- Andre C Barato, Udo Seifert
- The journal of physical chemistry. B
- 2015
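
The dispersion bound alluded to above is now widely known as the thermodynamic uncertainty relation. In its standard form (notation assumed, not quoted from the paper: $X$ an integrated output such as consumed molecules or motor steps over an observation time, $\Sigma$ the total entropy production over that time):

```latex
% Thermodynamic uncertainty relation (standard form; notation assumed)
\frac{\operatorname{Var}(X)}{\langle X \rangle^{2}}
\;\ge\;
\frac{2 k_{\mathrm{B}}}{\Sigma} .
```

Reducing relative fluctuations below a given level thus carries a minimal thermodynamic cost: each factor-of-two reduction in the squared relative error requires at least twice the entropy production.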

The Fano factor, an observable quantifying fluctuations of product generation by a single enzyme, can reveal information about the underlying reaction scheme. A lower bound on this Fano factor that depends on the thermodynamic affinity driving the transformation from substrate to product constrains the number of intermediate states of an enzymatic cycle. So…

- Andre C Barato, Udo Seifert
- Physical review. E, Statistical, nonlinear, and…
- 2014
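
For a unicyclic enzymatic scheme with $N$ states driven by affinity $A$ (in units of $k_{\mathrm{B}}T$), the bound discussed in this literature takes the form below. This is a hedged reconstruction in assumed notation, not a quotation from the paper:

```latex
% Lower bound on the Fano factor for an N-state unicyclic enzyme (notation assumed)
F \;\ge\; \frac{1}{N}\,\coth\!\left(\frac{A}{2N}\right) .
```

In the strongly driven limit $A \to \infty$ this reduces to $F \ge 1/N$, so a measured Fano factor $F$ implies at least $1/F$ states in the enzymatic cycle.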

We generalize stochastic thermodynamics to include information reservoirs. Such information reservoirs, which can be modeled as a sequence of bits, modify the second law. For example, work extraction from a system in contact with a single heat bath becomes possible if the system also interacts with an information reservoir. We obtain an inequality, and the…

- Patrick Pietzonka, Andre C Barato, Udo Seifert
- Physical review. E
- 2016
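
For a system in contact with a single heat bath at temperature $T$ and an information reservoir modeled as a bit tape, the modified second law can be sketched as follows (notation assumed; $\Delta H$ is the change in Shannon entropy of the tape, in nats):

```latex
% Second law modified by an information reservoir (notation assumed)
W_{\mathrm{ext}} \;\le\; k_{\mathrm{B}} T \, \Delta H ,
```

so work extraction from a single bath is possible precisely when writing to the tape increases its Shannon entropy, i.e. when ordered bits are randomized.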

For current fluctuations in nonequilibrium steady states of Markovian processes, we derive four different universal bounds valid beyond the Gaussian regime. Different variants of these bounds apply to either the entropy change or any individual current, e.g., the rate of substrate consumption in a chemical reaction or the electron current in an electronic…

- A C Barato, U Seifert
- Physical review letters
- 2014
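
One bound in this family is often stated as a parabolic bound on the large-deviation rate function $I(j)$ of a fluctuating current $j$ with steady-state mean $\langle j \rangle$ and entropy production rate $\sigma$ (notation assumed, not quoted from the paper):

```latex
% Parabolic bound on the current large-deviation function (notation assumed)
I(j) \;\le\; \frac{\sigma}{4}\left(\frac{j}{\langle j \rangle} - 1\right)^{2} ,
```

which recovers the thermodynamic uncertainty relation when expanded to second order around $\langle j \rangle$.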

So far, feedback-driven systems have been discussed using (i) measurement and control, (ii) a tape interacting with a system, or (iii) an implicit Maxwell demon identified in steady-state transport. We derive the corresponding second laws from one master fluctuation theorem and discuss their relationship. In particular, we show that both the entropy…
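
A master fluctuation theorem of the kind invoked here typically yields the second laws as corollaries via Jensen's inequality. A generic sketch in assumed notation, with $\Delta s_{\mathrm{tot}}$ the total entropy production and $\Delta I$ an information term whose precise form depends on the setup:

```latex
% Generic information fluctuation theorem and its second-law corollary (notation assumed)
\left\langle e^{-\Delta s_{\mathrm{tot}} - \Delta I} \right\rangle = 1
\quad\Longrightarrow\quad
\langle \Delta s_{\mathrm{tot}} \rangle \;\ge\; -\langle \Delta I \rangle .
```

Each of the three setups (i)–(iii) then corresponds to a different identification of $\Delta I$.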

For a paradigmatic model of chemotaxis, we analyze how a nonzero affinity, which drives receptors out of equilibrium, affects sensitivity. This affinity arises whenever changes in receptor activity involve ATP hydrolysis. The sensitivity integrated over a ligand concentration range is shown to be enhanced by the affinity, providing a measure of how much…

- Andre C Barato, Udo Seifert
- Physical review. E, Statistical, nonlinear, and…
- 2015

We derive expressions for the dispersion for two classes of random variables in Markov processes. Random variables such as current and activity pertain to the first class, which is composed of random variables that change whenever a jump in the stochastic trajectory occurs. The second class corresponds to the time the trajectory spends in a state (or…

- A. C. Barato, U. Seifert
- Physical review. E, Statistical, nonlinear, and…
- 2013

For sensory networks, we determine the rate at which they acquire information about the changing external conditions. Comparing this rate with the thermodynamic entropy production that quantifies the cost of maintaining the network, we find that there is no universal bound restricting the rate of obtaining information to be less than this thermodynamic…