# To code, or not to code: lossy source-channel communication revisited

@article{Gastpar2003ToCO, title={To code, or not to code: lossy source-channel communication revisited}, author={M. Gastpar and B. Rimoldi and M. Vetterli}, journal={IEEE Trans. Inf. Theory}, year={2003}, volume={49}, pages={1147-1158} }

What makes a source-channel communication system optimal? It is shown that in order to achieve an optimal cost-distortion tradeoff, the source and the channel have to be matched in a probabilistic sense. The match (or lack of it) involves the source distribution, the distortion measure, the channel conditional distribution, and the channel input cost function. Closed-form necessary and sufficient expressions relating the above entities are given. This generalizes both the separation-based…
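A concrete instance of such a probabilistic match is the classic example of a memoryless Gaussian source sent uncoded over an AWGN channel under squared-error distortion and an input power constraint: simple scaling plus MMSE estimation attains the same distortion as the best separation-based scheme. A minimal Monte-Carlo sketch of this well-known example (source variance, power, and noise values are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma2_s, P, N0 = 1.0, 4.0, 1.0     # source variance, input power, noise variance (assumed)

s = rng.normal(0.0, np.sqrt(sigma2_s), n)      # memoryless Gaussian source
x = np.sqrt(P / sigma2_s) * s                  # uncoded: scale to meet the power constraint
y = x + rng.normal(0.0, np.sqrt(N0), n)        # AWGN channel
# Linear MMSE estimate of s from y (optimal here, since s and y are jointly Gaussian)
s_hat = (np.sqrt(P * sigma2_s) / (P + N0)) * y

d_uncoded = np.mean((s - s_hat) ** 2)
d_opta = sigma2_s / (1 + P / N0)               # Shannon limit: sigma^2 * 2^(-2C) for this pair
print(d_uncoded, d_opta)                       # agree up to Monte-Carlo error
```

Here the Shannon limit follows from equating the rate-distortion function R(D) = ½log₂(σ²/D) with the AWGN capacity C = ½log₂(1 + P/N₀), so uncoded transmission with blocklength one is already optimal — the matching phenomenon the paper characterizes in general.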

#### 499 Citations

To code or not to code: Revisited

- Mathematics, Computer Science
- 2012 IEEE Information Theory Workshop
- 2012

The dilemma of whether one should or should not code when operating under delay constraints is revisited; it is shown that symbol-by-symbol transmission, though asymptotically suboptimal, might outperform not only separate source-channel coding but also the best known random-coding joint source-channel coding achievability bound in the finite-blocklength regime.

Strategies for Delay-Limited Source-Channel Coding

- Mathematics
- 2010

In point-to-point source-channel communication with a fidelity criterion and a transmission cost constraint, the region of achievable cost and fidelity pairs is completely characterized by Shannon's…

Analog Matching of Colored Sources to Colored Channels

- Mathematics, Computer Science
- ISIT
- 2006

It is shown that by combining prediction and modulo-lattice arithmetic, one can match any stationary Gaussian source to any inter-symbol-interference, colored-noise Gaussian channel, thereby attaining Shannon's optimum performance R(D) = C.

Lossless Transmission of Correlated Sources over a Multiple Access Channel with Side Information

- Computer Science
- 2007 Data Compression Conference (DCC'07)
- 2007

A source-channel separation theorem is proved: there is no loss in performance in first applying distributed source coding, where each encoder compresses its source conditioned on the side information at the receiver, and then applying an optimal multiple-access channel code with independent codebooks.

Information spectrum approach to the source channel separation theorem

- Mathematics, Computer Science
- 2014 IEEE International Symposium on Information Theory
- 2014

This work proves a stronger claim in which the source is general, satisfying only a "sphere-packing optimality" property, and the channel is completely general; if the channel satisfies the strong converse property, the same statement holds with the average distortion level d_avg replacing d_max.

Slepian-Wolf coding over broadcast channels

- Computer Science
- IEEE Transactions on Information Theory
- 2006

It is shown with an example that an optimal joint source-channel coding strategy is strictly advantageous over the combination of stand-alone source and channel codes, and thus "informational separation" does not hold.

Multiterminal Source–Channel Communication Over an Orthogonal Multiple-Access Channel

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2007

This work provides a characterization of the optimal tradeoff between the transmission cost Γ and the distortion vector D measured against the individual sources, and determines the optimal power-distortion tradeoff in a quadratic Gaussian sensor network under orthogonal multiple access.

Source-Channel Coding and Separation for Generalized Communication Systems

- Computer Science, Mathematics
- ArXiv
- 2009

We consider transmission of stationary and ergodic sources over non-ergodic composite channels with channel state information at the receiver (CSIR). Previously we introduced alternate capacity…

On Zero-Delay Source-Channel Coding

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2014

It is shown that the Gaussian source-channel pair is unique in the sense that it is the only pair for which the optimal mappings are linear at more than one CSNR value.

Quantifying Performance Losses in Source-Channel Coding

- 2007

In this paper, we identify and quantify loss factors causing sub-optimal performance in joint source-channel coding. We show that both the loss due to non-Gaussian distributed channel symbols and the…

#### References

Showing 1–10 of 20 references

On joint source-channel coding for the Wyner-Ziv source and the Gel'fand-Pinsker channel

- Computer Science
- IEEE Trans. Inf. Theory
- 2003

A separation theorem is proved: there is no loss in asymptotic optimality in applying an optimal Wyner-Ziv source code followed by an optimal Gel'fand-Pinsker channel code in a lossy joint source-channel coding system.

Theoretical limitations on the transmission of data from analog sources

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1965

The minimum mean-squared error is calculated for cases in which the analog source and additive channel noise are stationary Gaussian processes, and the performance of amplitude and angle modulation systems is compared to the theoretically ideal performance obtainable in some of these cases.

On source/channel codes of finite block length

- Mathematics
- Proceedings. 2001 IEEE International Symposium on Information Theory (IEEE Cat. No.01CH37252)
- 2001

For certain fortunate choices of source/channel pairs, all sophisticated coding is in vain: for them, a code of block length one is sufficient to achieve optimal performance. Is the set of "fortunate…

The source-channel separation theorem revisited

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1995

The single-user separation theorem of joint source-channel coding has been proved previously for wide classes of sources and channels. We find an information-stable source/channel pair which does not…

On the capacity of large Gaussian relay networks

- Computer Science
- IEEE Transactions on Information Theory
- 2005

This paper provides a new example where a simple cut-set upper bound is achievable, and one more example where uncoded transmission achieves optimal performance in a network joint source-channel coding problem.

Duality between source coding and channel coding and its extension to the side information case

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 2003

This work begins with a mathematical characterization of the functional duality between classical source and channel coding, formulating the precise conditions under which the optimal encoder for one problem is functionally identical to the optimal decoder for the other problem.

On the capacity of wireless networks: the relay case

- Computer Science
- Proceedings.Twenty-First Annual Joint Conference of the IEEE Computer and Communications Societies
- 2002

It is shown that lower and upper bounds meet asymptotically as the number of nodes in the network goes to infinity, thus proving that the capacity of the wireless network with n nodes under the relay traffic pattern behaves like log n bits per second.

Successive refinement of information: characterization of the achievable rates

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1994

A model is studied in which a source is first described at rate R₁ with distortion no larger than Δ₁, and then refined by a more accurate description at a lower distortion.

Successive refinement of information

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1991

It is shown that the necessary and sufficient condition for optimal successive refinement is that the solutions of the rate-distortion problem can be written as a Markov chain; all finite-alphabet sources with Hamming distortion satisfy this requirement.
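The quadratic-Gaussian case is the canonical example satisfying this Markov condition: describing the source in two stages costs no extra rate over coding directly for the finer distortion. A quick numeric check using R(D) = ½log₂(σ²/D), with distortion targets chosen arbitrarily for illustration:

```python
import numpy as np

# Quadratic-Gaussian rate-distortion function R(D) = 1/2 * log2(sigma^2 / D)
def rate(sigma2, d):
    return 0.5 * np.log2(sigma2 / d)

sigma2 = 1.0
d1, d2 = 0.25, 0.05            # coarse and fine distortion targets (assumed values)

r1 = rate(sigma2, d1)          # rate of the coarse description
r_refine = rate(d1, d2)        # extra rate of the refinement layer
r_direct = rate(sigma2, d2)    # rate of coding directly for distortion d2

print(r1 + r_refine, r_direct) # equal: the Gaussian source is successively refinable
```

The two-stage rates sum exactly to the one-shot rate, which is the "no rate loss" property that successive refinability formalizes.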

Beyond the separation principle: A broader approach to source-channel coding

- Mathematics
- 2002

Note: invited paper (reference LCM-CONF-2002-002).