# Recent Contributions to The Mathematical Theory of Communication

@inproceedings{Weaver2009RecentCT, title={Recent Contributions to The Mathematical Theory of Communication}, author={Warren Weaver and Claude E. Shannon}, year={2009} }

This paper is written in three main sections. In the first and third, W. W. is responsible both for the ideas and the form. The middle section, namely “2) Communication Problems at Level A” is an interpretation of mathematical papers by Dr. Claude E. Shannon of the Bell Telephone Laboratories. Dr. Shannon’s work roots back, as von Neumann has pointed out, to Boltzmann’s observation, in some of his work on statistical physics (1894), that entropy is related to “missing information,” inasmuch as…

## 312 Citations

Shannon and Weaver Model of Communication

- Computer Science
- 2010

While Shannon focused on the engineering aspects of his theory, Weaver developed its philosophical aspects related to human communication, extending and applying Shannon's information theory to different kinds of communication.

Information theory and the ethylene genetic network

- Computer Science, Plant Signaling & Behavior
- 2011

A novel proposal is discussed that consists of the modeling of gene expression with a stochastic approach that allows Shannon entropy (H) to be directly used to measure the amount of uncertainty that the genetic machinery has in relation to the correct decoding of a message transmitted into the nucleus by a signaling pathway.
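The entropy measure invoked here is Shannon's H, which quantifies a receiver's uncertainty over a distribution of symbols. A minimal sketch of the computation (the function name and example alphabet are illustrative, not from the cited paper):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy H = -sum(p_i * log2(p_i)) over the empirical
    distribution of the given symbols, measured in bits."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform four-symbol message carries 2 bits of uncertainty per symbol;
# a constant message carries none.
print(shannon_entropy("ACGT"))  # → 2.0
print(shannon_entropy("AAAA"))  # → 0.0 (actually -0.0 in floating point)
```

Under this stochastic framing, higher H means the genetic machinery faces more uncertainty about which message the signaling pathway transmitted.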

A Synergetic Theory of Information

- Computer Science, Inf.
- 2019

It is shown that various measures of information are structural characteristics of integrative codes of elements of discrete systems, and the synergetic approach to the definition of the quantity of information is primary in relation to the approaches of Hartley and Shannon.

A Generalized Information Formula as the Bridge between Shannon and Popper

- Computer Science, ArXiv
- 2007

The paper shows how to select a prediction or sentence from many candidates, for forecasting and language translation, according to the generalized information criterion. It also introduces rate-fidelity theory, which improves on the rate-distortion theory of classical information theory by replacing distortion with the generalized mutual information criterion.

The Entropy Universe

- Computer Science, Entropy
- 2021

The Entropy Universe is presented, which aims to review the many variants of entropies applied to time-series for different scientific fields, establishing bases for researchers to properly choose the variant of entropy most suitable for their data.

"A Measure of Disorder" - Entropy as Metaphor for the Other of Order (original German title: "A Measure of Disorder" - Entropie als Metapher für das Andere der Ordnung)

- Computer Science
- 2014

Entropy remains an unclear and heterogeneous notion that plays a major role in theorizing (and measuring) the other side of order, and it is inappropriate to translate entropy into the social sciences, hence to use it as a justification for pessimistic prospects.

Mutual redundancies in interhuman communication systems: Steps toward a calculus of processing meaning

- Computer Science, J. Assoc. Inf. Sci. Technol.
- 2014

Shannon's theory is extended by defining mutual redundancy as a positional counterpart of the relational communication of information: the surplus of meanings that can be provided to the exchanges in reflexive communications.

Quantum Error Correction Codes

- Computer Science, Physics
- 2020

This review paper explains why developing quantum error correction codes is critical.

Information, Meaning, and Intellectual Organization in Networks of Inter-Human Communication

- Computer Science, ArXiv
- 2014

The Shannon-Weaver model of linear information transmission is extended with two loops potentially generating redundancies: (i) meaning is provided locally to the information from the perspective of…

Synergetic Theory of Information

- Computer Science
- 2018

The conclusion is made that, from the informational-genetic positions, the synergetic theory of information is primary in relation to the Hartley-Shannon information theory.