• Corpus ID: 5747983

A Mathematical Theory of Communication

  • Jin Shin
  • Sang Joon Kim
This paper opened the new field of information theory. Before it, most people believed that the only way to make the error probability of transmission as small as desired was to reduce the data rate (for example, with a long repetition scheme). Surprisingly, this paper revealed that reducing the data rate is not necessary: it proved that an arbitrarily small error probability can be achieved at some positive data rate, and also that there is an upper bound of… 
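
For a concrete instance of the theorem sketched above, the binary symmetric channel with crossover probability p has capacity 1 − H(p), which stays positive for any p ≠ 1/2; a minimal sketch in Python (the function names are my own, not from the paper):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# Even a channel that flips 11% of its bits supports a positive rate
# at which the error probability can be driven arbitrarily low:
print(bsc_capacity(0.11))
```

Only at p = 1/2, where the output is independent of the input, does the achievable rate drop to zero.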

The birthday problem and zero-error list codes

This paper studies the performance of randomly generated codebooks over discrete memoryless channels under a zero-error constraint and leads to an information-theoretic formulation of the birthday problem, which is concerned with the probability that in a given population, a fixed number of people have the same birthday.
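
The classical birthday probability invoked here has a simple product form; a minimal sketch (the zero-error list-code connection itself is developed in the paper and not reproduced):

```python
def birthday_collision_prob(n_people: int, n_days: int = 365) -> float:
    """Probability that at least two of n_people share a birthday,
    assuming n_days equally likely birthdays."""
    p_no_collision = 1.0
    for k in range(n_people):
        p_no_collision *= (n_days - k) / n_days
    return 1.0 - p_no_collision

# The classic threshold: 23 people already give a >50% collision chance.
print(birthday_collision_prob(23))
```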

Error Probability Analysis of Binary Asymmetric Channels

The optimal (in the sense of minimum average error probability, using maximum likelihood decoding) code structure is derived for the cases of two, three, and four codewords and an arbitrary blocklength.
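
Maximum likelihood decoding over a binary asymmetric channel, as used in this analysis, can be sketched by scoring each codeword with its channel log-likelihood; a toy illustration (the codewords and crossover probabilities below are hypothetical, not the paper's optimal code structure):

```python
import math

def ml_decode(received, codebook, p01, p10):
    """Maximum-likelihood decoding over a binary asymmetric channel,
    where p01 = P(receive 1 | send 0) and p10 = P(receive 0 | send 1)."""
    def log_likelihood(codeword):
        ll = 0.0
        for sent, got in zip(codeword, received):
            if sent == 0:
                ll += math.log(p01 if got == 1 else 1 - p01)
            else:
                ll += math.log(p10 if got == 0 else 1 - p10)
        return ll
    return max(codebook, key=log_likelihood)

# Two hypothetical codewords of blocklength 4 over an asymmetric channel:
codebook = [(0, 0, 0, 0), (1, 1, 1, 1)]
print(ml_decode((1, 0, 0, 0), codebook, p01=0.1, p10=0.3))  # (0, 0, 0, 0)
```

Because the two crossover probabilities differ, the decision regions are not symmetric: a received word can sit closer in Hamming distance to one codeword yet be more likely under the other.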

Fundamental Limits of Communication With Low Probability of Detection

This paper considers the problem of communication over a discrete memoryless channel (DMC) or an additive white Gaussian noise (AWGN) channel subject to the constraint that the probability that an…

Message transmission in the presence of noise. Second asymptotic theorem and its various formulations

In this chapter, we provide the most significant asymptotic results concerning the existence of optimal codes for noisy channels. It is proven that Shannon's amount of information is a bound on…

Limits of low-probability-of-detection communication over a discrete memoryless channel

This paper considers the problem of communication over a discrete memoryless channel subject to the constraint that the probability that an adversary who observes the channel outputs can detect the…

Information Theory Tutorial Communication over Channels with memory

A general capacity formula C = sup_X I(X; Y) is introduced, which is correct for arbitrary single-user channels without feedback, and it is shown how feedback can increase the channel capacity when the channel has memory.
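
For a channel with a known transition matrix, C = sup_X I(X; Y) can be approximated by searching over input distributions; a brute-force sketch for binary-input channels (the grid search is my own simplification standing in for the supremum):

```python
import math

def mutual_information(px, channel):
    """I(X;Y) in bits for input distribution px and a row-stochastic
    transition matrix channel[x][y] = P(Y=y | X=x)."""
    n_out = len(channel[0])
    py = [sum(px[x] * channel[x][y] for x in range(len(px)))
          for y in range(n_out)]
    mi = 0.0
    for x, pxv in enumerate(px):
        for y in range(n_out):
            p = pxv * channel[x][y]
            if p > 0 and py[y] > 0:
                mi += p * math.log2(p / (pxv * py[y]))
    return mi

def capacity_grid(channel, steps=10001):
    """Approximate C = sup_X I(X;Y) for a binary-input channel
    by a grid search over input distributions [a, 1-a]."""
    return max(mutual_information([a, 1 - a], channel)
               for a in (i / (steps - 1) for i in range(steps)))

# BSC with crossover 0.1: the search recovers 1 - H(0.1) ≈ 0.531 bits.
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(capacity_grid(bsc))
```

For symmetric channels the maximizing input is uniform, so the grid hits the supremum exactly; in general, a finer grid (or the Blahut-Arimoto algorithm) tightens the approximation.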

Channel Coding

  • A. Ahrens
  • Computer Science
    Video Coding for Wireless Communication Systems
  • 2018
According to Shannon's theorem, as long as a delay is allowed so that the encoding is done in blocks of size n, the error probability can be made arbitrarily low for R ≤ C and tends to 1 for R > C, where C is the channel capacity.

Uncomputability of the generalized capacity

It is shown that there is no equivalent of the Blahut-Arimoto algorithm for computing the generalized capacity of a channel: such an algorithm cannot exist.

Concentration of Random-Coding Error Exponents

It is shown that the error exponent of a code, defined as the negative normalized logarithm of the probability of error, converges in probability to the typical error exponent.

On the Power of Feedback in Interactive Channels

It is shown that the interactive channel capacity of the BSC with feedback and bit-flip probability ε is at least 1 − O(√ε), while the upper bound on the BSC without feedback was recently shown to be 1 − Ω(√H(ε)).