Information Theory, Inference, and Learning Algorithms

@article{Mackay2004InformationTI,
  title={Information Theory, Inference, and Learning Algorithms},
  author={David J. C. MacKay},
  journal={IEEE Transactions on Information Theory},
  year={2004},
  volume={50},
  pages={2544-2545}
}
  • D. MacKay
  • Published 2004
  • Computer Science
  • IEEE Transactions on Information Theory
Fun and exciting textbook on the mathematics underpinning the most dynamic areas of modern science and engineering. 

Analysis of biological and chemical systems using information theoretic approximations

Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Biological Engineering, 2010.

Formally justified and modular Bayesian inference for probabilistic programs

On-the-fly machine learning of quantum mechanical forces and its potential applications for large scale molecular dynamics

Doctor of Philosophy thesis, Department of Physics, School of Natural and Mathematical Sciences.

Message-Passing for Inference and Optimization of Real Variables on Sparse Graphs

Inference and optimization in sparse graphs with real variables are studied using methods of statistical mechanics; numerical simulations show excellent performance and full agreement with the theoretical results.
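
As a toy illustration of this kind of message passing over real-valued variables, here is a minimal sketch of Gaussian belief propagation on a small tree-structured model; the 3-variable example, parameters, and names are invented for illustration and are not taken from the paper.

    # Gaussian belief propagation on a small sparse (tree) graph: a hedged
    # sketch, not the paper's algorithm. Model: p(x) ~ exp(-x'Ax/2 + b'x),
    # so the exact marginal means solve A mu = b; BP recovers them on trees.
    import numpy as np

    A = np.array([[ 2.0, -0.5,  0.0],
                  [-0.5,  2.0, -0.5],
                  [ 0.0, -0.5,  2.0]])
    b = np.array([1.0, 0.0, 1.0])
    n = len(b)
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and A[i, j] != 0.0]

    # Each message is a Gaussian stored as a (precision, precision*mean) pair.
    P = {e: 0.0 for e in edges}
    h = {e: 0.0 for e in edges}

    for _ in range(20):                      # a few sweeps suffice on a tree
        for (i, j) in edges:
            # Combine node i's local potential with all messages except j's,
            # then integrate out x_i to get the message sent to x_j.
            alpha = A[i, i] + sum(P[k, m] for (k, m) in edges
                                  if m == i and k != j)
            beta = b[i] + sum(h[k, m] for (k, m) in edges
                              if m == i and k != j)
            P[i, j] = -A[i, j] ** 2 / alpha
            h[i, j] = -A[i, j] * beta / alpha

    mu = [(b[i] + sum(h[k, m] for (k, m) in edges if m == i)) /
          (A[i, i] + sum(P[k, m] for (k, m) in edges if m == i))
          for i in range(n)]
    print(np.round(mu, 6))                   # matches np.linalg.solve(A, b)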

Divergence measures for statistical data processing

This note provides a bibliography of investigations based on or related to divergence measures for theoretical and applied inference problems.
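
As a concrete anchor for the topic, here is a hedged sketch of two divergence measures that recur throughout this literature, the Kullback-Leibler and Jensen-Shannon divergences for discrete distributions; the example distributions are made up.

    # Two standard divergence measures for discrete distributions: a hedged
    # sketch. KL is asymmetric and unbounded; Jensen-Shannon symmetrises it.
    import numpy as np

    def kl_divergence(p, q):
        """D(p||q) = sum_i p_i * log(p_i / q_i), in nats; 0 * log 0 := 0."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def js_divergence(p, q):
        m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
        return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

    p, q = [0.5, 0.4, 0.1], [0.3, 0.3, 0.4]  # invented example distributions
    print(kl_divergence(p, q), kl_divergence(q, p), js_divergence(p, q))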

Efficient Methods for Unsupervised Learning of Probabilistic Models

In this thesis I develop a variety of techniques to train, evaluate, and sample from intractable and high dimensional probabilistic models. Abstract exceeds arXiv space limitations -- see PDF.

Information, Uncertainty, and Surprise

Covers measuring uncertainty and information, the maximum entropy principle, and binary search games as a way of measuring uncertainty and information.
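
To make "measuring uncertainty" concrete, a brief sketch: Shannon entropy in bits, with the uniform distribution attaining the maximum, which is the intuition the maximum entropy principle builds on. The example distributions are illustrative only.

    # Shannon entropy in bits: a small sketch of "measuring uncertainty".
    import numpy as np

    def entropy_bits(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                          # convention: 0 * log 0 = 0
        return float(-np.sum(p * np.log2(p)))

    # With no constraint beyond normalisation, the maximum entropy principle
    # selects the uniform distribution: the least committal choice.
    print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits, the maximum
    print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits, less uncertain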

Information Theory for Human and Social Processes

Entropy, Inference, and Channel Coding

This article surveys applications of convex optimization theory to topics in information theory, taking a fresh look at the relationship between channel coding and robust hypothesis testing and at the structure of optimal input distributions in channel coding.
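
The remark about optimal input distributions can be made concrete with the classical Blahut-Arimoto algorithm, which computes channel capacity by alternating two closed-form updates; the binary symmetric channel below is a standard illustrative example, not one taken from the article.

    # Blahut-Arimoto iteration for channel capacity: a hedged sketch.
    # W[x, y] = P(y | x) for a discrete memoryless channel.
    import numpy as np

    def blahut_arimoto(W, iters=200):
        n_in = W.shape[0]
        p = np.full(n_in, 1.0 / n_in)        # start from a uniform input
        for _ in range(iters):
            # Posterior over inputs given each output, under the current p.
            q = p[:, None] * W
            q /= q.sum(axis=0, keepdims=True)
            # Re-weight inputs by how distinguishable their outputs are.
            log_r = (W * np.log(q, where=q > 0,
                                out=np.zeros_like(q))).sum(axis=1)
            p = np.exp(log_r)
            p /= p.sum()
        joint = p[:, None] * W               # mutual information, in bits
        py = joint.sum(axis=0)
        ratio = np.divide(joint, p[:, None] * py[None, :],
                          where=joint > 0, out=np.ones_like(joint))
        return p, float((joint * np.log2(ratio)).sum())

    W = np.array([[0.9, 0.1],                # binary symmetric channel,
                  [0.1, 0.9]])               # crossover probability 0.1
    p_opt, capacity = blahut_arimoto(W)
    print(p_opt, capacity)                   # ~[0.5, 0.5] and ~0.531 bits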

Bayesian Model and Dimension Reduction for Uncertainty Propagation: Applications in Random Media

Well-established methods for the solution of stochastic partial differential equations (SPDEs) typically struggle in problems with high-dimensional inputs/outputs. Such difficulties are only amplified…
...

References

SHOWING 1-10 OF 236 REFERENCES

Graphical Models for Machine Learning and Digital Communication

Covers probabilistic inference in graphical models, pattern classification, unsupervised learning, data compression, channel coding, and future research directions.

The Nature of Statistical Learning Theory

  • V. Vapnik
  • Computer Science
    Statistics for Engineering and Information Science
  • 2000
Topics include the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, and constructing learning algorithms.

Statistical Decision Theory and Bayesian Analysis

An overview of statistical decision theory, which emphasizes the use and application of the philosophical ideas and mathematical structure of decision theory. The text assumes a knowledge of basic…

Learning and relearning in Boltzmann machines

This chapter contains sections titled: Relaxation Searches, Easy and Hard Learning, The Boltzmann Machine Learning Algorithm, An Example of Hard Learning, and Achieving Reliable Computation with Unreliable Hardware.
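
For orientation, the learning algorithm the chapter names adjusts each weight toward the difference between data and model correlations. Below is a minimal sketch of that rule for a tiny fully visible network, with an invented 3-unit data set and Gibbs sampling standing in for the chapter's relaxation searches; it is an illustration, not the chapter's code.

    # The Boltzmann machine learning rule, dw_ij ~ <s_i s_j>_data - <s_i s_j>_model,
    # for a tiny fully visible network with no biases: a hedged sketch.
    import numpy as np

    rng = np.random.default_rng(0)
    data = np.array([[1, 1, 0],
                     [1, 1, 1],
                     [0, 0, 0],
                     [1, 1, 0]], dtype=float)   # invented training patterns
    n_units = data.shape[1]
    W = np.zeros((n_units, n_units))            # symmetric, zero diagonal

    def gibbs_sample(W, steps=300):
        """Draw one approximate sample from the model by relaxation."""
        s = rng.integers(0, 2, size=n_units).astype(float)
        for _ in range(steps):
            i = rng.integers(n_units)
            p_on = 1.0 / (1.0 + np.exp(-W[i] @ s))   # P(s_i = 1 | others)
            s[i] = float(rng.random() < p_on)
        return s

    data_corr = data.T @ data / len(data)        # <s_i s_j> under the data
    for epoch in range(30):
        samples = np.array([gibbs_sample(W) for _ in range(20)])
        model_corr = samples.T @ samples / len(samples)
        grad = data_corr - model_corr            # the Boltzmann learning rule
        np.fill_diagonal(grad, 0.0)
        W += 0.2 * grad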

An Idiosyncratic Journey Beyond Mean Field Theory

This chapter contains sections titled: Introduction, Inference, Some Models from Statistical Physics, The Gibbs Free Energy, Mean Field Theory: The Variational Approach, Correcting Mean Field Theory, …

Variational Gaussian process classifiers

The variational methods of Jaakkola and Jordan are applied to Gaussian processes to produce an efficient Bayesian binary classifier.

LDPC codes: a group algebra formulation

Bayesian Learning via Stochastic Dynamics

Bayesian methods avoid overfitting and poor generalization by averaging the outputs of many networks whose weights are sampled from the posterior distribution given the training data; the sampling is done by simulating a stochastic dynamical system that has the posterior as its stationary distribution.
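
A minimal sketch of that idea, using the unadjusted Langevin update on an invented one-dimensional Gaussian target; Neal's scheme for neural networks is substantially more elaborate, so this only illustrates how simulating stochastic dynamics yields posterior samples.

    # Sampling a posterior by simulating stochastic dynamics: a hedged sketch.
    # The Langevin update's stationary distribution approximates the target
    # for small step sizes.
    import numpy as np

    rng = np.random.default_rng(1)

    def grad_log_posterior(w):
        # Invented target: Gaussian posterior, mean 2.0, variance 0.5.
        return -(w - 2.0) / 0.5

    eps = 0.01                   # step size; bias vanishes as eps -> 0
    w, samples = 0.0, []
    for t in range(20000):
        w += (0.5 * eps * grad_log_posterior(w)
              + np.sqrt(eps) * rng.standard_normal())
        if t >= 2000:            # discard burn-in
            samples.append(w)
    # Predictions would then be averaged over these weight samples.
    print(np.mean(samples), np.var(samples))   # ~2.0 and ~0.5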

“Haldane's Dilemma” and the Rate of Evolution

A recent estimate of the maximum rate of evolution by natural selection may be too low, based as it is on a maxim that seems to be erroneous.

Arithmetic coding for data compression

The state of the art in data compression is arithmetic coding, not the better-known Huffman method. Arithmetic coding gives greater compression, is faster for adaptive models, and clearly separates the model from the channel encoding.
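
The mechanism is easy to show in miniature: the message narrows a subinterval of [0, 1), and any number in the final interval identifies the message. The float-based sketch below works only for short strings; practical coders, including Witten, Neal, and Cleary's, use integer arithmetic with incremental bit output. The three-symbol model is invented.

    # The interval-narrowing idea behind arithmetic coding, in miniature.

    def cum_ranges(model):
        lo, out = 0.0, {}
        for sym, prob in model.items():
            out[sym] = (lo, lo + prob)
            lo += prob
        return out

    def encode(msg, model):
        ranges, lo, hi = cum_ranges(model), 0.0, 1.0
        for sym in msg:                      # each symbol narrows the interval
            span = hi - lo
            s_lo, s_hi = ranges[sym]
            lo, hi = lo + span * s_lo, lo + span * s_hi
        return (lo + hi) / 2                 # any point in [lo, hi) will do

    def decode(code, length, model):
        ranges, out = cum_ranges(model), []
        for _ in range(length):
            for sym, (s_lo, s_hi) in ranges.items():
                if s_lo <= code < s_hi:      # which subinterval are we in?
                    out.append(sym)
                    code = (code - s_lo) / (s_hi - s_lo)
                    break
        return "".join(out)

    model = {"a": 0.6, "b": 0.3, "c": 0.1}   # an invented static source model
    code = encode("abac", model)
    print(code, decode(code, 4, model))      # recovers "abac"
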
...