# Memoryless Thermodynamics? A Reply

@article{Mandal2015MemorylessTA, title={Memoryless Thermodynamics? A Reply}, author={Dibyendu Mandal and Alexander B. Boyd and James P. Crutchfield}, journal={ArXiv}, year={2015}, volume={abs/1508.03311} }

Several years ago, Chris Jarzynski and one of us (DM) introduced a solvable model of a thermodynamic ratchet that leveraged information to convert thermal energy to work [1, 2]. Our hope was to give a new level of understanding of the Second Law of Thermodynamics and one of its longest-lived counterexamples: Maxwell's Demon. As it reads in "bits" from an input string Y, a detailed-balance stochastic multistate controller raises or lowers a mass against gravity, writing "exhaust" bits to an…
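The tape-driven engine described above can be caricatured numerically. The sketch below is not the detailed-balance multistate controller of [1, 2]; it only illustrates the Second-Law bound that governs such machines: a ratchet that reads a biased input tape and writes a higher-entropy output tape can extract at most kT ln 2 times the per-bit entropy increase. All function and parameter names here are hypothetical.

```python
import random
from math import log2

def binary_entropy(p):
    """Shannon entropy H(p) of a biased bit, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

def run_ratchet(n, p_in=0.1, p_out=0.5, seed=42):
    """Toy tape-driven engine: read an input tape of bits biased
    toward 0 (P[1] = p_in) and write each outgoing bit rethermalized
    to bias p_out.  Returns the empirical per-bit entropies of the
    two tapes; the Second Law bounds the extractable work per bit by
    kT ln 2 * (H_out - H_in)."""
    rng = random.Random(seed)
    tape_in = [int(rng.random() < p_in) for _ in range(n)]
    tape_out = [int(rng.random() < p_out) for _ in tape_in]
    h_in = binary_entropy(sum(tape_in) / n)
    h_out = binary_entropy(sum(tape_out) / n)
    return h_in, h_out

h_in, h_out = run_ratchet(100_000)
print(f"work bound per bit: {h_out - h_in:.3f} kT ln 2")
```

With a strongly biased (low-entropy) input tape and an unbiased output tape, the bound is positive, which is the sense in which such a device "leverages information" to deliver work; a fully random input tape would leave no room for work extraction.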

## 2 Citations

Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets

- Physics, Mathematics
- 2015

We introduce a family of Maxwellian Demons for which correlations among information bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely…

Correlation-powered Information Engines and the Thermodynamics of Self-Correction

- Mathematics, Physics
- Physical Review E
- 2017

A thermodynamic mechanism based on nonergodicity that underlies error correction as it operates to support resilient engineered and biological systems is revealed.

## References

Showing 1-10 of 16 references.

Unifying three perspectives on information processing in stochastic thermodynamics.

- Mathematics, Physics
- Physical Review Letters
- 2014

Stochastic thermodynamics is generalized to the presence of an information reservoir. It is shown that both the entropy production involving mutual information between system and controller and the one involving the Shannon entropy difference of an information reservoir, such as a tape, carry an extra term beyond the usual current times affinity.

Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets

- Physics, Mathematics
- 2015

We introduce a family of Maxwellian Demons for which correlations among information bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely…

Work and information processing in a solvable model of Maxwell’s demon

- Physics, Medicine
- Proceedings of the National Academy of Sciences
- 2012

A minimal model of an autonomous Maxwell demon, a device that delivers work by rectifying thermal fluctuations while simultaneously writing information to a memory register, offers a simple paradigm for investigating the thermodynamics of information processing by small systems.

Comments on "Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets" (arXiv:1507.01537v2)

- Physics
- 2015

The above article is about a family of Maxwell-like demons for which there are correlations between "information-bearing degrees of freedom" (quoting the authors' description of their work).…

Maxwell Demon Dynamics: Deterministic Chaos, the Szilard Map, and the Intelligence of Thermodynamic Systems.

- Physics, Medicine
- Physical Review Letters
- 2016

The Szilard map is introduced: a deterministic chaotic system that encapsulates the measurement, control, and erasure protocol by which Maxwellian demons extract work from a heat reservoir. The map symmetrizes the demon and the thermodynamic system, allowing one to explore their functionality and recover the fundamental trade-off between the thermodynamic costs of dissipation due to measurement and those due to erasure.

Stochastic thermodynamics with information reservoirs.

- Mathematics, Medicine
- Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
- 2014

An inequality is obtained for the entropy production of information processing, which yields the second law in the presence of information reservoirs, and a systematic linear response theory for information-processing machines is developed.

Regularities unseen, randomness observed: levels of entropy convergence.

- Mathematics, Physics
- Chaos
- 2003

Several phenomenological approaches to applying information-theoretic measures of randomness and memory to stochastic and deterministic processes are synthesized. Successive derivatives of the Shannon entropy growth curve are used to relate a process's entropy convergence behavior to its underlying computational structure.

Sequence complexity and work extraction

- Mathematics, Physics
- ArXiv
- 2015

We consider a simplified version of a solvable model by Mandal and Jarzynski, which constructively demonstrates the interplay between work extraction and the increase of the Shannon entropy of an…

Computational Mechanics of Input–Output Processes: Structured Transformations and the ε-Transducer

- Mathematics, Physics
- 2014

This work lays the foundation of a structural analysis of mechanisms that support information flow between processes, treating joint processes and processes with input, and obtains an analogous optimal model of the stochastic mapping between them.

Elements of Information Theory

- Engineering, Computer Science
- 1991

The authors examine the role of entropy, inequality, and randomness in the design and construction of codes.