On the separation of lossy source-network coding and channel coding in wireline networks

@article{Jalali2010OnTS,
  title={On the separation of lossy source-network coding and channel coding in wireline networks},
  author={Shirin Jalali and Michelle Effros},
  journal={2010 IEEE International Symposium on Information Theory},
  year={2010},
  pages={500-504}
}
  • S. Jalali, M. Effros
  • Published 2 May 2010
  • Computer Science, Mathematics
  • 2010 IEEE International Symposium on Information Theory
This paper proves the separation between source-network coding and channel coding in networks of noisy, discrete, memoryless channels. We show that the set of achievable distortion matrices in delivering a family of dependent sources across such a network equals the set of achievable distortion matrices for delivering the same sources across a distinct network built by replacing each channel with a noiseless, point-to-point bit-pipe of the corresponding capacity. Thus a code that applies… 
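A minimal formal sketch of the claimed equivalence, in our own notation rather than the paper's: let \mathcal{N} denote the original network of noisy, discrete, memoryless channels and \bar{\mathcal{N}} the network obtained by replacing every channel e with a noiseless bit pipe whose rate equals that channel's capacity. The separation result then reads

  \mathcal{D}(\mathcal{N}) \;=\; \mathcal{D}(\bar{\mathcal{N}}), \qquad C_e \;=\; \max_{p(x_e)} I(X_e; Y_e) \ \text{for every channel } e,

where \mathcal{D}(\cdot) is the set of achievable distortion matrices for the given family of dependent sources.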
Separation of Source-Network Coding and Channel Coding in Wireline Networks
  • S. Jalali, M. Effros
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 2015
TLDR
This paper proves the separation of source-network coding and channel coding in wireline networks and extends the separation result to continuous-alphabet point-to-point channels, such as additive white Gaussian noise channels.
On source-channel separation in networks
TLDR
It is shown that the separation approach is optimal when the memoryless sources at the source nodes are arbitrarily correlated and the channels in the network are synchronized, orthogonal, and memoryless.
On sources and networks: Can computational tools derive information theoretic limits?
  • M. Effros
  • Computer Science
    2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2011
TLDR
The goal of this work is to build computational tools to derive provable inner and outer bounds on the set of achievable distortions for any source-network coding problem.
Hybrid Coding: An Interface for Joint Source-Channel Coding and Network Communication
TLDR
A new approach to joint source-channel coding is presented in the context of communicating correlated sources over multiple access channels, in which the same codeword is used for both source coding and channel coding; this allows the resulting hybrid coding scheme to achieve the performance of the best known joint source-channel coding schemes.
Optimality and Approximate Optimality of Source-Channel Separation in Networks
TLDR
It is shown that the separation approach is optimal in two general scenarios and is approximately optimal in a third scenario, which generalizes the second scenario by allowing each source to be reconstructed at multiple destinations with different distortions.
A Unified Approach to Hybrid Coding
TLDR
A generalized hybrid coding technique is proposed for communication over discrete memoryless and Gaussian systems, and its utility is demonstrated via three examples: lossy joint source-channel coding over multiple access channels, channel coding over two-way relay channels, and channel coding over diamond networks.
Network Coding and Distributed Compression over Large Networks: Some Basic Principles
TLDR
This work investigates several new approaches to bounding the achievable rate region for general network source coding problems: reducing a network to an equivalent network or collection of networks, investigating the effect of feedback on achievable rates, and characterizing the role of side information.
Reduced-Dimension Linear Transform Coding of Correlated Signals in Networks
TLDR
A model called the linear transform network (LTN) is proposed to analyze the compression and estimation of correlated signals transmitted over directed acyclic graphs (DAGs), and cut-set lower bounds on the distortion region of multi-source, multi-receiver networks are given.

References

SHOWING 1-10 OF 21 REFERENCES
A separation theorem for single-source network coding
TLDR
The result can be regarded as a network generalization of Shannon's result that feedback does not increase the capacity of a discrete memoryless channel (DMC), and it implies a separation theorem for network coding and channel coding in such a communication network.
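For context, the point-to-point fact being generalized here is that output feedback leaves the capacity of a discrete memoryless channel unchanged (a standard result, restated in our own notation rather than the cited paper's):

  C_{\mathrm{FB}} \;=\; C \;=\; \max_{p(x)} I(X;Y).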
Separating distributed source coding from network coding
TLDR
This correspondence considers the problem of distributed source coding of multiple sources over a network with multiple receivers and shows that the problem with two sources and two receivers is always separable.
Linear Network Codes: A Unified Framework for Source, Channel, and Network Coding
TLDR
It is argued that coding in networks poses major challenges going beyond the lack of decomposability into canonical network modules, and it is shown that source-channel separation holds for several canonical network channels when the whole network operates over a common finite field.
Multiple user information theory
TLDR
A unified framework is given for multiple user information networks that consist of several users communicating with one another in the presence of arbitrary interference and noise, and speculations about the form of a general theory of information flow in networks are offered.
Network information flow: limits and achievability
  • S. Borade
  • Mathematics
    Proceedings IEEE International Symposium on Information Theory,
  • 2002
An information theoretic upper bound on the information flow in discrete memoryless networks is found. The networks considered here can have multiple information sources and multiple sinks.
The source-channel separation theorem revisited
The single-user separation theorem of joint source-channel coding has been proved previously for wide classes of sources and channels. We find an information-stable source/channel pair which does not…
On a theory of network equivalence
TLDR
An equivalence result for network capacity is described that a collection of demands can be met on the given network if and only if it can be meet on another network where each noisy link is replaced by a noiseless bit pipe with throughput equal to the noisy link capacity.
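The throughput assigned to each replacement bit pipe is the Shannon capacity of the corresponding noisy link. As an illustration only (a minimal sketch assuming numpy; the function name dmc_capacity, its parameters, and the tolerances are ours, not code from any cited paper), that capacity can be computed numerically with the Blahut-Arimoto algorithm:

  import numpy as np

  def dmc_capacity(W, tol=1e-9, max_iter=10_000):
      # Blahut-Arimoto estimate of the capacity (bits per channel use) of a
      # discrete memoryless channel with transition matrix W[x, y] = P(Y=y | X=x).
      W = np.asarray(W, dtype=float)
      p = np.full(W.shape[0], 1.0 / W.shape[0])   # start from the uniform input law
      lower = 0.0
      for _ in range(max_iter):
          q = p @ W                               # output distribution induced by p
          ratio = np.where(W > 0, W / np.maximum(q, 1e-300), 1.0)
          d = (W * np.log2(ratio)).sum(axis=1)    # D(W(.|x) || q) in bits, per input x
          lower, upper = np.log2(p @ np.exp2(d)), d.max()
          if upper - lower < tol:                 # the two bounds sandwich the capacity
              break
          p = p * np.exp2(d)                      # Blahut-Arimoto update of the input law
          p /= p.sum()
      return lower

  # Binary symmetric channel with crossover 0.1: capacity is 1 - h(0.1), about 0.531 bits.
  print(dmc_capacity([[0.9, 0.1], [0.1, 0.9]]))

In the equivalence described above, running such a computation on each link's transition matrix gives the rate of the noiseless bit pipe that replaces it.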
Coordination Capacity
TLDR
This work asks what dependence can be established among the nodes of a communication network given the communication constraints, and develops elements of a theory of cooperation and coordination in networks.
A mathematical theory of communication
  • C. Shannon
  • Computer Science, Mathematics
    Bell Syst. Tech. J.
  • 1948
In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now…
Probability: Theory and Examples
This book is an introduction to probability theory covering laws of large numbers, central limit theorems, random walks, martingales, Markov chains, ergodic theorems, and Brownian motion…