Information Theory

Information Theory
  • L. S. Goddard
Papers read at a Symposium on Information Theory held at the Royal Institution, London, August 29th to September 2nd, 1960. Edited by Colin Cherry. Pp. xi + 476. (London: Butterworth and Co. (Publishers), Ltd., 1961.) 95s.
Information Theory: A Tutorial Introduction
This paper is an informal but rigorous introduction to the main ideas implicit in Shannon's theory, and an annotated reading list is provided for further reading.
The field of Information Theory grew as researchers found more results and insights into the fundamental problems of transmission and storage of information using probabilistic models; Raymond Yeung's framework then made it possible to verify all Shannon-type inequalities.
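Yeung's framework (implemented in tools such as ITIP) checks such inequalities symbolically; as a minimal numerical sketch only, the code below spot-checks the most basic Shannon-type inequality, the non-negativity of mutual information, on randomly generated joint distributions. The distributions and the helper `H` are illustrative assumptions, not part of any cited paper.

```python
import math
import random

def H(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

random.seed(0)
for _ in range(1000):
    # Draw a random 2x2 joint distribution p(x, y).
    w = [random.random() for _ in range(4)]
    s = sum(w)
    joint = [x / s for x in w]
    px = [joint[0] + joint[1], joint[2] + joint[3]]  # marginal of X
    py = [joint[0] + joint[2], joint[1] + joint[3]]  # marginal of Y
    # Basic Shannon-type inequality: I(X; Y) = H(X) + H(Y) - H(X, Y) >= 0.
    assert H(px) + H(py) - H(joint) >= -1e-12
```

A symbolic verifier proves such inequalities for all distributions; this sketch only confirms no counterexample appears among the sampled ones.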
A Rosetta Stone for information theory and differential equations
  • A. Selvitella
  • Computer Science, Mathematics
    Communications in Advanced Mathematical Sciences
  • 2018
This paper proposes a dictionary between Partial Differential Equations and Information Theory and discusses in detail the example of the Schrödinger equation and Shannon Information Theory.
Multisource information theory
This article develops the algorithmic information theory counterpart of multisource information theory and uses it as a general framework for many interesting questions about Kolmogorov complexity.
An algebraic approach to information theory
  • M. Patra, S. Braunstein
  • Computer Science, Mathematics
    2010 IEEE International Symposium on Information Theory
  • 2010
An algebraic model of probability theory is given and several important theorems of classical probability and information theory are presented in the algebraic framework.
Information theory related learning
This is the introductory paper to a special session held at the ESANN 2011 conference. It reviews and highlights recent developments and new directions in information-theory-related learning, which is a fast-growing field.
Basic Concepts, Identities and Inequalities - the Toolkit of Information Theory
Basic concepts and results of that part of Information Theory which is often referred to as "Shannon Theory" are discussed with focus mainly on the discrete case. The paper is expository with some
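As a concrete illustration of the kind of identity such a toolkit covers, here is a minimal sketch (using a small, hypothetical joint distribution of my own choosing) that checks I(X; Y) = H(X) + H(Y) - H(X, Y) in the discrete case.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]                  # marginal of X
py = [sum(col) for col in zip(*joint)]            # marginal of Y
hx, hy = entropy(px), entropy(py)
hxy = entropy([p for row in joint for p in row])  # joint entropy H(X, Y)
mi = hx + hy - hxy                                # identity: I(X; Y)
```

For this distribution the marginals are uniform, so H(X) = H(Y) = 1 bit, and the positive mutual information (about 0.278 bits) reflects the correlation between X and Y.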
Fifty Years of Shannon Theory
  • S. Verdú
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1998
A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication.
Efficient Markets Meet the Shannon Limit (The Shannon Limit, Relative Channel Capacity, and Price Uncertainty)
A central concept of this paper is that information theory could help illuminate an important additional source of uncertainty in finance and economics. The communication constraints dictated by
Lecture notes on descriptional complexity and randomness
A didactical survey of the foundations of Algorithmic Information Theory. These notes are short on motivation, history and background but introduce some of the main techniques and concepts of the