The early days of information theory
@article{Pierce1973TheED,
  title   = {The early days of information theory},
  author  = {John R. Pierce},
  journal = {IEEE Trans. Inf. Theory},
  year    = {1973},
  volume  = {19},
  pages   = {3-8}
}
Shannon's communication (information) theory cast about as much light on the problem of the communication engineer as can be shed. It reflected or encompassed earlier work, but it was so new that it was not really understood by many of those who first talked or wrote about it. Most of the papers published on information theory through 1950 are irrelevant to Shannon's work. Later work has given us useful information and encoding schemes as well as greater rigor, but the wisdom of Shannon's way…
73 Citations
Engineering Theory and Mathematics in the Early Development of Information Theory
- Business
- 2004
It is shown that the contributions of American communications engineering theorists are directly tied to the socially constructed meanings of information theory held by members of that group, and that the early advancement of information theory is linked to the mutual interactions between these two social groups (communication engineers and mathematicians).
This is IT: A Primer on Shannon’s Entropy and Information
- Computer Science
- 2021
This tutorial reviews many aspects of the concept of entropy and information from a historical and mathematical point of view and culminates with a simple exposition of a recent proof of the entropy power inequality (EPI), one of the most fascinating inequalities in the theory.
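For reference, a standard statement of the classical EPI (not quoted from the paper itself): for independent continuous random vectors X and Y in R^n with differential entropy h(·),

N(X + Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n},

where N(X) is the entropy power of X.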
On Shannon's Formula and Hartley's Rule: Beyond the Mathematical Coincidence
- Computer Science
- Entropy
- 2014
A careful calculation shows that "Hartley's rule" in fact coincides with Shannon's formula, and the necessary and sufficient conditions on an additive noise channel for its capacity to be given by Shannon's formula are derived.
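For orientation, the two expressions compared in this paper are usually written as follows (a textbook formulation; the paper's own notation may differ):

C = W \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{(Shannon's formula)}, \qquad C' = 2W \log_2\!\left(1 + \frac{A}{\Delta}\right) \quad \text{(Hartley's rule)},

where W is the bandwidth, S/N the signal-to-noise power ratio, A the peak signal amplitude, and Δ the noise amplitude, so that roughly 1 + A/Δ amplitude levels can be distinguished and sent at the Nyquist rate 2W.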
Quantum Information Theory: Results and Open Problems
- Computer Science
- 2000
The discipline of information theory was founded by Claude Shannon in a truly remarkable paper [Sh] which laid down the foundations of the subject. We begin with a quote from this paper which is an…
From Classical to Quantum Shannon Theory
- Physics
- ArXiv
- 2011
Parts V and VI are the culmination of this book, where all of the tools developed come into play for understanding many of the important results in quantum Shannon theory.
Information science as a paradigmatic instance of a problem-based theory
- Mathematics
- 1997
IT is apparently organized differently from the Aristotelian ideal, i.e., by means of an infinite sequence of deductive inferences from a few self-evident principles. In the past, several…
On Development of Information Communications in Human Society
- Computer Science
- Interdisciplinary Description of Complex Systems
- 2019
It is nearly impossible to describe in detail the entire history of information communication in human society within a single paper, so the description and discussion focus on comprehensiveness and integrity.
Toward a genealogy of a cold war communication sciences: the strange loops of Leo and Norbert Wiener
- Art
- 2013
A modest footnote in the mid-century annals of digital communication sciences, this article observes several strange loops in the dual biographies of Norbert Wiener, a primary founder of cybernetics…
Entropy power inequalities and classical capacities of bosonic noise channels
- Computer Science
- 2013 IEEE Photonics Society Summer Topical Meeting Series
- 2013
The main tool is a quantum analog of the entropy power inequality introduced by Shannon, which gives a lower bound on the output von Neumann entropy when two independent signals combine at a beamsplitter.
Properties of the Quantum Channel
- Physics
- ArXiv
- 2012
This paper surveys the properties of the quantum communication channel, the various capacity measures, and the fundamental differences between classical and quantum channels.
References
Showing 1-10 of 34 references
A history of the theory of information
- Economics
- Trans. IRE Prof. Group Inf. Theory
- 1953
The paper mentions first some essential points about the early development of languages, codes and symbolism, picking out those fundamental points in human communication which have recently been…
Communication in the presence of noise — Probability of error for two encoding schemes
- Computer Science
- 1950
Two encoding schemes are described in which the ideal rate is approached when the signal length is increased, an idea suggested by Shannon's observation that in an efficient encoding system the typical signal will resemble random noise.
Theoretical Limitations on the Rate of Transmission of Information
- Computer Science
- Proceedings of the IRE
- 1949
A review of early work on the theory of the transmission of information is followed by a critical survey of this work and a refutation of the point that, in the absence of noise, there is a finite…
Transmission of information
- Physics
- 1928
A quantitative measure of “information” is developed which is based on physical as contrasted with psychological considerations. How the rate of transmission of this information over a system is…
Communication theory of secrecy systems
- Computer Science, Mathematics
- Bell Syst. Tech. J.
- 1949
A theory of secrecy systems is developed on a theoretical level and is intended to complement the treatment found in standard works on cryptography.
The theory of optimum noise immunity
- Education
- 1959
Life, thermodynamics, and cybernetics.
- Education
- American Scientist
- 1949
HOW is it possible to understand life, when the whole world is ruled by such a law as the second principle of thermodynamics, which points toward death and annihilation? This question has been asked…
Information and the Human Ear
- Computer Science
- 1951
Calculations of the informational capacity of the human ear are made by computing the number of discriminable sound patterns per second and applying Shannon's information theory; it is shown that a capacity upwards of 5 × 10^4 bits/sec, depending on the informational match to the ear, is necessary for high-fidelity transmission or recording.
Mathematical analysis of random noise
- Mathematics
- 1944
In this section we use the representations of the noise currents given in section 2.8 to derive some statistical properties of I(t). The first six sections are concerned with the probability…
The mathematics of communication.
- Computer Science
- Scientific American
- 1949
In communication there seem to be problems at three levels: (1) technical, (2) semantic, and (3) influential. It might seem that the mathematical theory bears only on the first level, but this is not the case, and some important recent work in the mathematical theory of communication is examined.