Sequence complexity and work extraction

@article{Merhav2015SequenceCA,
  title={Sequence complexity and work extraction},
  author={Neri Merhav},
  journal={ArXiv},
  year={2015},
  volume={abs/1503.07653}
}
  • N. Merhav
  • Published 26 March 2015
  • Mathematics, Physics, Computer Science
  • ArXiv
We consider a simplified version of a solvable model by Mandal and Jarzynski, which constructively demonstrates the interplay between work extraction and the increase of the Shannon entropy of an information reservoir which is in contact with the physical system. We extend Mandal and Jarzynski's main findings in several directions: First, we allow sequences of correlated bits rather than just independent bits. Second, at least for the case of binary information, we show that, in fact, the…
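As a rough numerical illustration of the abstract's claim (a sketch, not the paper's derivation): if a correlated binary input tape is modeled as a first-order Markov chain, a Mandal-Jarzynski-type bound limits the average extracted work per bit to kT ln 2 times the increase in the tape's entropy rate. The transition probabilities, the assumed output entropy rate, and the helper names (binary_entropy, markov_entropy_rate) below are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) variable."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def markov_entropy_rate(p01, p10):
    """Entropy rate (bits/symbol) of a stationary binary Markov chain
    with transition probabilities P(1|0) = p01 and P(0|1) = p10."""
    pi1 = p01 / (p01 + p10)      # stationary probability of symbol 1
    pi0 = 1.0 - pi1
    return pi0 * binary_entropy(p01) + pi1 * binary_entropy(p10)

kT = 4.11e-21                    # thermal energy (J) at ~298 K

h_in = markov_entropy_rate(0.1, 0.1)   # strongly correlated input bits
h_out = 1.0                            # output tape assumed fully randomized

# Second-law-type bound: average extracted work per bit is at most
# kT ln 2 times the increase in the tape's entropy rate.
w_max = kT * np.log(2) * (h_out - h_in)
print(f"h_in = {h_in:.3f} bits/symbol, work bound = {w_max:.3e} J per bit")
```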
Relations Between Work and Entropy Production for General Information-Driven, Finite-State Engines
TLDR
Under very few assumptions, a simple derivation is given of a family of inequalities relating work extraction to entropy production, applicable to any finite number of cycles and to a general (possibly correlated) input information sequence, which is not even assumed stationary.
Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets
We introduce a family of Maxwellian Demons for which correlations among information-bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely…
Information ratchets exploiting spatially structured information reservoirs.
TLDR
This work shows that the dimensionality of the reservoir has a significant impact on the performance and phase diagram of the demon and derives exact probabilities of recurrence in these systems, generalizing previously known results.
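The recurrence probabilities mentioned above are derived exactly in the paper; the generic Monte Carlo sketch below (my illustration, not the authors' calculation) shows the underlying dimensionality effect for simple random walks on Z^d, which return to the origin with probability 1 in one and two dimensions but not in three. The trial count and step cap are arbitrary, so the finite-horizon estimates only approach the exact values.

```python
import random

def ever_returns(dim, max_steps=5_000):
    """One trial: does a simple random walk on Z^dim revisit the origin
    within max_steps steps?"""
    pos = [0] * dim
    for _ in range(max_steps):
        axis = random.randrange(dim)
        pos[axis] += random.choice((-1, 1))
        if all(x == 0 for x in pos):
            return True
    return False

def return_probability(dim, trials=1_000):
    """Monte Carlo estimate of the probability of returning to the origin."""
    return sum(ever_returns(dim) for _ in range(trials)) / trials

for d in (1, 2, 3):
    print(f"d = {d}: P(return) ≈ {return_probability(d):.2f}")
```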
Comments on "Identifying Functional Thermodynamics in Autonomous Maxwellian Ratchets" (arXiv:1507.01537v2)
The above article is about a family of Maxwell-like demons for which there are correlations between “information-bearing degrees of freedom” (quoting from the authors’ description of their work).
Leveraging Environmental Correlations: The Thermodynamics of Requisite Variety
TLDR
It is shown that a ratchet must have memory to most effectively leverage structure and correlation in its environment; achieving the IPSL bounds on the amount of work a ratchet can extract from its environment is also investigated, with the finding that finite-state, optimal ratchets can go well beyond these bounds by utilizing their own infinite “negentropy”.
Above and Beyond the Landauer Bound: Thermodynamics of Modularity
TLDR
This work shows how to circumvent modularity dissipation by designing internal ratchet states that capture the global correlations and patterns in the ratchet’s information reservoir; designed in this way, information ratchets match the optimal thermodynamic efficiency of globally integrated computations.
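To make the modularity-dissipation idea concrete, here is a minimal sketch (an illustration under stated assumptions, not the paper's construction): for a stationary binary Markov input tape, a ratchet that processes bits one at a time without memory ignores the correlation I(X_t; X_{t+1}) between neighboring bits, and pays at least kT ln 2 times that mutual information per bit in extra dissipation relative to a globally integrated design. The transition probabilities and the helper pairwise_mutual_information are hypothetical.

```python
import numpy as np

def pairwise_mutual_information(p01, p10):
    """I(X_t; X_{t+1}) in bits for a stationary binary Markov chain
    with transition probabilities P(1|0) = p01 and P(0|1) = p10."""
    pi1 = p01 / (p01 + p10)
    pi0 = 1.0 - pi1
    # Joint distribution of (X_t, X_{t+1}); by stationarity both
    # marginals equal (pi0, pi1).
    joint = np.array([[pi0 * (1 - p01), pi0 * p01],
                      [pi1 * p10,       pi1 * (1 - p10)]])
    marg = np.array([pi0, pi1])
    mi = 0.0
    for i in range(2):
        for j in range(2):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (marg[i] * marg[j]))
    return mi

kT = 4.11e-21                      # thermal energy (J) at ~298 K
mi = pairwise_mutual_information(0.1, 0.1)
# A bit-by-bit (modular) ratchet that ignores these correlations dissipates
# at least kT ln 2 * I(X_t; X_{t+1}) per bit more than a global design.
print(f"I = {mi:.3f} bits -> extra dissipation >= {kT * np.log(2) * mi:.2e} J/bit")
```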
Correlation-powered Information Engines and the Thermodynamics of Self-Correction
TLDR
A thermodynamic mechanism based on nonergodicity that underlies error correction as it operates to support resilient engineered and biological systems is revealed.
Quantum and Information Thermodynamics: A Unifying Framework based on Repeated Interactions
We expand the standard thermodynamic framework of a system coupled to a thermal reservoir by considering a stream of independently prepared units repeatedly put into contact with the system. These…
Measurement-feedback formalism meets information reservoirs
TLDR
A second-law-like inequality is derived by applying the measurement-feedback formalism to information reservoirs; it provides a stronger bound on extractable work than any other known inequality in the same setup.
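For orientation, the classic measurement-feedback bound that this line of work refines is W_ext ≤ kT · I(X;Y), where I is the mutual information acquired by the measurement. The sketch below (my illustration; the paper's reservoir-aware inequality is stronger than this baseline) evaluates that bound for a binary state observed through a symmetric channel with error rate eps; all numbers are arbitrary.

```python
import math

def binary_entropy_nats(q):
    """Binary entropy in nats."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log(q) - (1 - q) * math.log(1 - q)

def mutual_information_bsc(p_x1, eps):
    """I(X;Y) in nats when X ~ Bernoulli(p_x1) is observed through a
    binary symmetric channel with error probability eps."""
    p_y1 = p_x1 * (1 - eps) + (1 - p_x1) * eps
    return binary_entropy_nats(p_y1) - binary_entropy_nats(eps)

kT = 4.11e-21                     # thermal energy (J) at ~298 K
I = mutual_information_bsc(0.5, 0.1)
# Sagawa-Ueda-type feedback bound: work extractable per measurement
# is at most kT * I(X;Y); reservoir-aware inequalities tighten this.
print(f"I = {I:.3f} nats -> W_max <= {kT * I:.2e} J")
```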
Stochastic thermodynamics of information processing: bipartite systems with feedback, signal inference and information storage
TLDR
A framework for two continuously coupled systems is developed that incorporates information and refines the standard second law of thermodynamics for bipartite systems, and a purely information-theoretic quantity, called the sensory capacity, is introduced.

References

Showing 1-10 of 36 references
Stochastic thermodynamics with information reservoirs.
  • A. C. Barato, U. Seifert
  • Mathematics, Medicine
    Physical Review E: Statistical, Nonlinear, and Soft Matter Physics
  • 2014
TLDR
An inequality is obtained for the entropy production of information processing, which gives the second law in the presence of information reservoirs, and a systematic linear response theory for information-processing machines is developed.
Optimized finite-time information machine
We analyze a periodic optimal finite-time two-state information-driven machine that extracts work from a single heat bath exploring imperfect measurements. Two models are considered: a memory-less…
Compression of individual sequences via variable-rate coding
TLDR
The proposed concept of compressibility is shown to play a role analogous to that of entropy in classical information theory where one deals with probabilistic ensembles of sequences rather than with individual sequences.
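To illustrate the compressibility-entropy analogy in this reference, here is a minimal sketch of LZ78 incremental parsing (a standard construction; the function names and test parameters are my own): the normalized codelength c(n) log2 c(n) / n of an individual sequence converges, slowly, toward the entropy rate when the sequence is drawn from a stationary ergodic source.

```python
import math, random

def lz78_phrase_count(s):
    """Number of phrases in the LZ78 incremental parsing of s: each phrase
    is the shortest prefix of the remainder not yet in the dictionary."""
    dictionary, phrase, count = set(), "", 0
    for ch in s:
        phrase += ch
        if phrase not in dictionary:
            dictionary.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)   # count a trailing partial phrase

def lz_complexity_rate(s):
    """Normalized LZ codelength c(n) log2 c(n) / n in bits/symbol, an
    entropy-like compressibility measure for the individual sequence s."""
    n, c = len(s), lz78_phrase_count(s)
    return c * math.log2(c) / n

random.seed(0)
p, n = 0.2, 200_000
seq = "".join("1" if random.random() < p else "0" for _ in range(n))
h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
print(f"LZ estimate: {lz_complexity_rate(seq):.3f} bits/symbol vs H({p}) = {h:.3f}")
```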
Data Processing Theorems and the Second Law of Thermodynamics
  • N. Merhav
  • Mathematics, Computer Science
    IEEE Transactions on Information Theory
  • 2011
TLDR
It turns out that both the generalized data processing theorems and the Boltzmann H-Theorem can be viewed as special cases of a more general principle concerning the monotonicity (in time) of a certain generalized information measure applied to a Markov process.
Perfectly secure encryption of individual sequences
  • N. Merhav
  • Mathematics, Computer Science
    2012 IEEE International Symposium on Information Theory Proceedings
  • 2012
TLDR
This work defines a similar notion of “finite-state encryptability” of an individual plaintext sequence, as the minimum asymptotic key rate that must be consumed by finite-state encrypters so as to guarantee perfect secrecy in a well-defined sense.
An autonomous and reversible Maxwell's demon
Building on a model introduced by Mandal and Jarzynski (Proc. Natl. Acad. Sci. U.S.A., 109 (2012) 11641), we present a simple version of an autonomous reversible Maxwell's demon. By changing the…
Unifying three perspectives on information processing in stochastic thermodynamics.
TLDR
Stochastic thermodynamics is generalized to the presence of an information reservoir, and it is shown that both the entropy production involving the mutual information between system and controller and the one involving a Shannon entropy difference of an information reservoir, such as a tape, carry an extra term beyond the usual current times affinity.
Thermodynamics of information
By its very nature, the second law of thermodynamics is probabilistic, in that its formulation requires a probabilistic description of the state of a system. This raises questions about the…
Second-law-like inequalities with information and their interpretations
In a thermodynamic process with measurement and feedback, the second law of thermodynamics is no longer valid. In its place, various second-law-like inequalities have been advanced that each…
Stochastic thermodynamics of bipartite systems: transfer entropy inequalities and a Maxwell’s demon interpretation
We consider the stationary state of a Markov process on a bipartite system from the perspective of stochastic thermodynamics. One subsystem is used to extract work from a heat bath while being…