The early days of information theory

  • J. Pierce
  • Published 1973
  • Computer Science
  • IEEE Trans. Inf. Theory
Shannon's communication (information) theory cast about as much light on the problem of the communication engineer as can be shed. It reflected or encompassed earlier work, but it was so new that it was not really understood by many of those who first talked or wrote about it. Most of the papers published on information theory through 1950 are irrelevant to Shannon's work. Later work has given us useful information and encoding schemes as well as greater rigor, but the wisdom of Shannon's way… 

Engineering Theory and Mathematics in the Early Development of Information Theory

It is shown that the contributions of American communications engineering theorists are directly tied to the socially constructed meanings of information theory held by members of the group, and that the early advancement of information theory is linked to the mutual interactions between these two social groups.

This is IT: A Primer on Shannon’s Entropy and Information

This tutorial reviews many aspects of the concepts of entropy and information from a historical and mathematical point of view, and culminates with a simple exposition of a recent proof of the entropy power inequality (EPI), one of the most fascinating inequalities in the theory.
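For context, the classical entropy power inequality referenced above is standard (stated by Shannon, proved by Stam and Blachman) and can be written, for independent random vectors X and Y in R^n with finite differential entropies:

```latex
% Classical entropy power inequality for independent X, Y in R^n:
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
% equivalently N(X+Y) \ge N(X) + N(Y), where the entropy power is
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n}.
```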

On Shannon's Formula and Hartley's Rule: Beyond the Mathematical Coincidence

A careful calculation shows that "Hartley's rule" in fact coincides with Shannon's formula, and the necessary and sufficient conditions on an additive noise channel under which its capacity is given by Shannon's formula are derived.
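As a reminder of the two expressions being compared (standard textbook forms, not quoted from the paper itself): Shannon's capacity formula for a band-limited channel with additive white Gaussian noise, and Hartley's rule for a signal of amplitude A resolved to within a noise amplitude Δ:

```latex
% Shannon's formula: bandwidth W (Hz), signal power P, noise power N:
C \;=\; W \log_2\!\left(1 + \frac{P}{N}\right) \quad \text{bits/s}.
% Hartley's rule: 2W samples/s, each distinguishing roughly 1 + A/\Delta levels:
C' \;=\; 2W \log_2\!\left(1 + \frac{A}{\Delta}\right) \quad \text{bits/s}.
```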

Quantum Information Theory: Results and Open Problems

The discipline of information theory was founded by Claude Shannon in a truly remarkable paper [Sh] which laid down the foundations of the subject. We begin with a quote from this paper which is an

From Classical to Quantum Shannon Theory

Parts V and VI are the culmination of this book, where all of the tools developed come into play for understanding many of the important results in quantum Shannon theory.

Information science as a paradigmatic instance of a problem-based theory

Information science is apparently organized in a different way than the Aristotelian ideal, i.e., by means of an infinite sequence of deductive inferences from a few self-evident principles. In the past, several

On Development of Information Communications in Human Society

  • Bang Zhang
  • Computer Science
    Interdisciplinary Description of Complex Systems
  • 2019
It is nearly impossible to describe in detail the entire history of information in human society in a single paper, so the description and discussion focus on comprehensiveness and integrity.

Toward a genealogy of a cold war communication sciences: the strange loops of Leo and Norbert Wiener

A modest footnote in the mid-century annals of digital communication sciences, this article observes several strange loops in the dual biographies of Norbert Wiener, a primary founder of cybernetics

Entropy power inequalities and classical capacities of bosonic noise channels

The main tool is a quantum analog of the entropy power inequality introduced by Shannon, which gives a lower bound on the output von Neumann entropy when two independent signals combine at a beamsplitter.
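For reference, the beamsplitter form of the quantum entropy power inequality (due to König and Smith) is usually stated as follows, where λ is the beamsplitter transmissivity and S denotes the von Neumann entropy of an n-mode state; this is the standard statement, not quoted from the paper above:

```latex
% Quantum EPI for a beamsplitter of transmissivity \lambda mixing
% independent n-mode inputs X and Y into output Z:
e^{S(Z)/n} \;\ge\; \lambda\, e^{S(X)/n} + (1-\lambda)\, e^{S(Y)/n}.
```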

Properties of the Quantum Channel

This paper overviews the properties of the quantum communication channel, the various capacity measures and the fundamental differences between the classical and quantum channels.

A history of the theory of information

The paper mentions first some essential points about the early development of languages, codes and symbolism, picking out those fundamental points in human communication which have recently been

Communication in the presence of noise — Probability of error for two encoding schemes

Two encoding schemes are described in which the ideal rate is approached when the signal length is increased, an idea suggested by Shannon's observation that in an efficient encoding system the typical signal will resemble random noise.

Theoretical Limitations on the Rate of Transmission of Information

A review of early work on the theory of the transmission of information is followed by a critical survey of this work and a refutation of the point that, in the absence of noise, there is a finite

Transmission of information

A quantitative measure of “information” is developed which is based on physical as contrasted with psychological considerations. How the rate of transmission of this information over a system is

Communication theory of secrecy systems

  • C. Shannon
  • Computer Science, Mathematics
    Bell Syst. Tech. J.
  • 1949
A theory of secrecy systems is developed on a theoretical level and is intended to complement the treatment found in standard works on cryptography.

The theory of optimum noise immunity


Life, thermodynamics, and cybernetics.

HOW is it possible to understand life, when the whole world is ruled by such a law as the second principle of thermodynamics, which points toward death and annihilation? This question has been asked

Information and the Human Ear

Calculations of the informational capacity of the human ear are made by computing the number of discriminable sound patterns per second and applying the Shannon information theory, and it is shown that a capacity of upwards of 5 × 10⁴ bits/sec, depending on the informational match to the ear, is necessary for high-fidelity transmission or recording.

Mathematical analysis of random noise

In this section we use the representations of the noise currents given in section 2.8 to derive some statistical properties of I(t). The first six sections are concerned with the probability

The mathematics of communication.

  • W. Weaver
  • Computer Science
    Scientific American
  • 1949
In communication there seem to be problems at three levels: 1) technical, 2) semantic, and 3) influential; but this is not the case, and some important recent work in the mathematical theory of communication is examined.