Nonequilibrium thermodynamics of self-supervised learning

Domingos S. P. Salazar


Detecting Cybersecurity Attacks in Internet of Things Using Artificial Intelligence Methods: A Systematic Literature Review

A systematic literature review that categorizes, maps, and surveys the existing literature on AI methods used to detect cybersecurity attacks in the IoT environment, providing an insight into the AI roadmap for detecting threats based on attack categories.
Thermodynamic efficiency of learning a rule in neural networks

Using stochastic thermodynamics, it is shown that the thermodynamic costs of the learning process provide an upper bound on the amount of information that the network is able to learn from its teacher for both batch and online learning.

Spectral dynamics of learning in restricted Boltzmann machines

A generic statistical ensemble is proposed for the weight matrix of the RBM and its mean evolution is characterized, unveiling how the selected modes interact in the later stages of the learning procedure and defining a deterministic learning curve for the RBM.

Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences.

  • G. Crooks
  • Physics
    Physical Review E
  • 1999
A generalized version of the fluctuation theorem is derived for stochastic, microscopically reversible dynamics and this generalized theorem provides a succinct proof of the nonequilibrium work relation.
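
For reference, the generalized fluctuation theorem of this paper is the Crooks relation between the forward and reverse work distributions, from which the nonequilibrium work relation (the Jarzynski equality) follows by integration:

```latex
% Crooks fluctuation theorem: forward vs. reverse work distributions
\frac{P_F(+W)}{P_R(-W)} = e^{\beta (W - \Delta F)}

% Integrating over W yields the nonequilibrium work relation (Jarzynski equality)
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
```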

Stochastic thermodynamics: principles and perspectives

Stochastic thermodynamics provides a framework for describing small systems like colloids or biomolecules driven out of equilibrium but still in contact with a heat bath. Both a first-law-like energy balance and entropy production can be consistently defined along individual fluctuating trajectories.

Stochastic thermodynamics of learning

Stochastic thermodynamics is used to analyze the learning of a classification rule by a neural network; it is shown that the information acquired by the network is bounded by the thermodynamic cost of learning, and a learning efficiency is introduced.
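
In the language of information thermodynamics, a bound of this kind is typically stated by comparing the mutual information I gained by the network with the total entropy production of the learning process. A sketch in generic notation (not necessarily the paper's exact symbols):

```latex
% Second-law bound on learning (entropy in units of k_B):
I \le \Delta S_{\mathrm{tot}},
\qquad
\eta \equiv \frac{I}{\Delta S_{\mathrm{tot}}} \le 1
```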

Deep autoregressive models for the efficient variational simulation of many-body quantum systems

This work proposes a specialized neural-network architecture that supports efficient and exact sampling, completely circumventing the need for Markov-chain sampling, and demonstrates the ability to obtain accurate results on larger system sizes than those currently accessible to neural-network quantum states.

Machine learning and the physical sciences

This article reviews in a selective way the recent research on the interface between machine learning and the physical sciences, including conceptual developments in ML motivated by physical insights, applications of machine learning techniques to several domains in physics, and cross-fertilization between the two fields.

Solving the quantum many-body problem with artificial neural networks

A variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons is introduced, together with a reinforcement-learning scheme capable of both finding the ground state and describing the unitary time evolution of complex interacting quantum systems.

Inverse statistical problems: from the inverse Ising problem to data science

This review focuses on the inverse Ising problem and closely related problems, namely how to infer the coupling strengths between spins given observed spin correlations, magnetizations, or other data.

Insightful classification of crystal structures using deep learning

This study uses machine learning to automatically classify more than 100,000 simulated perfect and defective crystal structures, paving the way for crystal structure recognition of—possibly noisy and incomplete—three-dimensional structural data in big-data materials science.