Corpus ID: 211572587

A Free-Energy Principle for Representation Learning

@article{Gao2020AFP,
  title={A Free-Energy Principle for Representation Learning},
  author={Yansong Gao and Pratik Chaudhari},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.12406}
}
This paper employs a formal connection of machine learning with thermodynamics to characterize the quality of learnt representations for transfer learning. We discuss how information-theoretic functionals such as the rate, distortion and classification loss of a model lie on a convex, so-called equilibrium surface. We prescribe dynamical processes to traverse this surface under constraints, e.g., an iso-classification process that trades off rate and distortion to keep the classification loss …
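As a rough sketch of the construction summarized above (the form below is an illustrative Lagrangian; the multipliers \lambda and \gamma and their exact arrangement may differ from the paper's parametrization), the free energy trades off the rate R, distortion D and classification loss C, and an iso-classification process follows a path (\lambda(t), \gamma(t)) on the equilibrium surface along which the classification loss stays fixed:

    F(\lambda, \gamma) = \min_{\text{model}} \big( R + \lambda\, D + \gamma\, C \big),
    \text{iso-classification: } \quad dC = \frac{\partial C}{\partial \lambda}\, d\lambda + \frac{\partial C}{\partial \gamma}\, d\gamma = 0 .

Under such a constraint, moving along the surface changes R and D together while C is held constant, which is the trade-off the abstract refers to.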
Citations

An Information-Geometric Distance on the Space of Tasks
Controllable Guarantees for Fair Outcomes via Contrastive Information Estimation
Likelihood Ratio Exponential Families
