# A Mathematical Theory of Communication

@inproceedings{Shin2006AMT, title={A Mathematical Theory of Communication}, author={Jin Shin and Sang Joon Kim}, year={2006} }

This paper opened the new field of information theory. Before it, most people believed that the only way to make the error probability of transmission as small as desired was to reduce the data rate (for example, with a long repetition scheme). Surprisingly, this paper revealed that reducing the data rate is not necessary to achieve such small errors. It proved that a positive data rate can be maintained with the same small error probability, and also that there is an upper bound of…
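The contrast the paper drew can be seen in a small sketch (with hypothetical parameters: a binary symmetric channel with flip probability 0.1). A repetition scheme with majority decoding drives the error down only by driving the rate 1/n toward zero, which is exactly the trade-off Shannon showed is unnecessary below capacity.

```python
from math import comb

def repetition_error(p, n):
    """Probability that majority decoding of an n-fold repetition of one
    bit fails on a binary symmetric channel with flip probability p
    (n odd): a majority of the n transmitted copies must be flipped."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

p = 0.1  # assumed flip probability for illustration
for n in (1, 3, 5, 7):
    print(f"rate = 1/{n}, error = {repetition_error(p, n):.5f}")
```

The error probability shrinks with n, but so does the rate; Shannon's theorem asserts instead that any fixed rate below capacity admits arbitrarily small error.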

## 52,807 Citations

### The birthday problem and zero-error list codes

- Computer Science
- 2017 IEEE International Symposium on Information Theory (ISIT)
- 2017

This paper studies the performance of randomly generated codebooks over discrete memoryless channels under a zero-error constraint and leads to an information-theoretic formulation of the birthday problem, which is concerned with the probability that in a given population, a fixed number of people have the same birthday.
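The classical birthday problem that this entry invokes is easy to compute exactly; a minimal sketch (the 365-day year is the usual simplifying assumption):

```python
def no_collision_prob(population, days=365):
    """Probability that `population` people all have distinct birthdays,
    assuming birthdays are uniform and independent over `days` days."""
    prob = 1.0
    for k in range(population):
        prob *= (days - k) / days
    return prob

# Classic result: with 23 people, a shared birthday is more likely than not.
print(1 - no_collision_prob(23))  # ≈ 0.507
```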

### Error Probability Analysis of Binary Asymmetric Channels

- Computer Science
- 2010

The optimal code structure (in the sense of minimum average error probability under maximum-likelihood decoding) is derived for the cases of two, three, and four codewords and an arbitrary blocklength.

### Fundamental Limits of Communication With Low Probability of Detection

- Computer Science
- IEEE Transactions on Information Theory
- 2016

This paper considers the problem of communication over a discrete memoryless channel (DMC) or an additive white Gaussian noise (AWGN) channel subject to the constraint that the probability that an…

### Message transmission in the presence of noise. Second asymptotic theorem and its various formulations

- Computer Science
- 2020

In this chapter, we provide the most significant asymptotic results concerning the existence of optimal codes for noisy channels. It is proven that Shannon's amount of information is a bound on…

### Limits of low-probability-of-detection communication over a discrete memoryless channel

- Computer Science
- 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015

This paper considers the problem of communication over a discrete memoryless channel subject to the constraint that the probability that an adversary who observes the channel outputs can detect the…

### Information Theory Tutorial Communication over Channels with memory

- Computer Science, Mathematics
- 2005

A general capacity formula C = sup_X I(X; Y) is introduced, which is correct for arbitrary single-user channels without feedback, and it is shown how feedback can increase the channel capacity when the channel has memory.
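For a memoryless channel the formula C = sup_X I(X; Y) can be evaluated directly; a minimal sketch for a binary symmetric channel (flip probability 0.1 assumed for illustration), where the supremum has the closed form 1 − H(p) attained at the uniform input:

```python
from math import log2

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def mutual_information_bsc(q, p):
    """I(X; Y) for a BSC with flip probability p and input P(X=1) = q."""
    y1 = q * (1 - p) + (1 - q) * p   # P(Y = 1)
    return h2(y1) - h2(p)

p = 0.1
# Grid search over input distributions approximates sup_X I(X; Y).
best = max(mutual_information_bsc(q / 1000, p) for q in range(1001))
print(best, 1 - h2(p))  # the two values agree: C = 1 - H(p)
```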

### Channel Coding

- Computer Science
- Video Coding for Wireless Communication Systems
- 2018

According to Shannon's theorem, as long as the authors allow a delay such that the encoding is done in blocks of size n, the error probability is arbitrarily low for R ≤ C and is 1 for R > C, where C is the channel capacity.

### Uncomputability of the generalized capacity

- Computer Science
- ArXiv
- 2016

It is shown that there is no equivalent to the Blahut-Arimoto algorithm for computing the generalized capacity of a channel and that such an algorithm can not exist.

### Concentration of Random-Coding Error Exponents

- Computer Science
- 2021 IEEE Information Theory Workshop (ITW)
- 2021

It is shown that the error exponent of a code, defined as the negative normalized logarithm of the probability of error, converges in probability to the typical error exponent.
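The random-coding error exponent in question is Gallager's E_r(R); for a BSC with uniform inputs it has a closed form, sketched below (flip probability 0.1 and the grid resolution are illustrative assumptions):

```python
from math import log2

def random_coding_exponent(p, R, grid=1000):
    """Gallager's random-coding exponent E_r(R) for a BSC(p) with uniform
    inputs: max over 0 <= rho <= 1 of E0(rho) - rho * R, where
    E0(rho) = rho - (1 + rho) * log2(p^(1/(1+rho)) + (1-p)^(1/(1+rho)))."""
    def e0(rho):
        s = 1.0 / (1.0 + rho)
        return rho - (1.0 + rho) * log2(p**s + (1 - p)**s)
    return max(e0(k / grid) - (k / grid) * R for k in range(grid + 1))
```

The exponent is positive for rates below capacity (about 0.531 bits for p = 0.1) and vanishes at and above it, matching the exponential decay of the random-coding error probability.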

### On the Power of Feedback in Interactive Channels

- Computer Science
- 2013

It is shown that the interactive channel capacity of the BSC with feedback and bit-flip probability ε is at least 1 − O(√ε), while the upper bound for the BSC without feedback was recently shown to be 1 − Ω(√H(ε)).