Raymond W. Yeung

We introduce a new class of problems called network information flow, inspired by computer network applications. Consider a point-to-point communication network on which a number of information sources are to be multicast to certain sets of destinations. We assume that the information sources are mutually independent. The problem is to characterize …
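A minimal Python sketch of the butterfly-network example associated with this line of work, assuming two one-bit sources and the standard XOR coding at the bottleneck; the node and variable names here are illustrative, not taken from the paper's text:

# Butterfly network: both source bits must reach both sinks.
# Routing alone cannot deliver rate 2 to both sinks, but letting the
# bottleneck node forward b1 XOR b2 achieves it.

def butterfly(b1: int, b2: int):
    coded = b1 ^ b2            # bottleneck node combines its two inputs
    t1 = (b1, b1 ^ coded)      # sink t1: gets b1 directly, recovers b2
    t2 = (b2 ^ coded, b2)      # sink t2: recovers b1, gets b2 directly
    return t1, t2

# Every input pair is recovered exactly at both sinks.
assert all(
    butterfly(b1, b2) == ((b1, b2), (b1, b2))
    for b1 in (0, 1) for b2 in (0, 1)
)

The point mirrors the abstract: coding inside the network, rather than routing alone, is what attains the multicast capacity.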
Consider a communication network in which certain source nodes multicast information to other nodes in a multihop fashion, where every node can pass on any of its received data to other nodes. We are interested in how fast each node can receive the complete information, or equivalently, what the information rate arriving at each node is. Allowing …
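A compact restatement of the rate question this abstract raises; the symbols $s$ (source), $T$ (set of receiving nodes), and $\operatorname{maxflow}$ are introduced here for illustration and are not quoted from the paper:

\[
R_{\text{multicast}} = \min_{t \in T} \operatorname{maxflow}(s, t),
\]

that is, the maximum rate at which every node in $T$ can receive the complete information equals the smallest source-to-node max-flow, and linear coding at the nodes suffices to achieve it.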
Given $n$ discrete random variables $\Theta = \{X_1, \ldots, X_n\}$, associated with any subset $\alpha$ of $\{1, 2, \ldots, n\}$ there is a joint entropy $H(X_\alpha)$, where $X_\alpha = \{X_i : i \in \alpha\}$. This can be viewed as a function defined on $2^{\{1, 2, \ldots, n\}}$ taking values in $[0, +\infty)$. We call this function the entropy function of $\Theta$. The nonnegativity of the joint entropies implies that this function is nonnegative; the …
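A small Python sketch of the entropy function just defined, for a hypothetical joint distribution of $n = 2$ binary variables; the distribution p and helper H are illustrative choices, not from the paper:

from itertools import combinations
from math import log2

# Hypothetical joint distribution of (X1, X2).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(alpha):
    """Joint entropy H(X_alpha) for the index set alpha."""
    marg = {}
    for x, px in p.items():
        key = tuple(x[i] for i in alpha)
        marg[key] = marg.get(key, 0.0) + px
    return -sum(q * log2(q) for q in marg.values() if q > 0)

# The entropy function is defined on all 2^n subsets and is nonnegative.
n = 2
for r in range(n + 1):
    for alpha in combinations(range(n), r):
        assert H(alpha) >= 0
print(H((0,)), H((1,)), H((0, 1)))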
In the paradigm of network coding, the nodes in a network are allowed to encode the information received from their input links. With network coding, the full capacity of the network can be utilized. In this paper, we propose a model, called the wiretap network, that incorporates information security with network coding. In this model, a collection of subsets …
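A toy Python instance of the wiretap-network idea under stated assumptions: one message symbol over GF(5), one uniformly random key, two edge-disjoint paths, and a wiretapper who observes any single link. The field size and two-path topology are illustrative, not the paper's general construction:

import random

q = 5  # illustrative field size

def encode(m: int):
    """Send the key on one path and the masked message on the other."""
    k = random.randrange(q)      # fresh uniform key per message
    return k, (m + k) % q

def decode(y1: int, y2: int) -> int:
    """The legitimate sink sees both links and recovers m."""
    return (y2 - y1) % q

m = 3
assert decode(*encode(m)) == m
# A wiretapper seeing only one of the two links observes a uniform
# symbol independent of m, so nothing about the message leaks.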
In Part I of this paper, we introduced the paradigm of network error correction as a generalization of classical link-by-link error correction. We also obtained the network generalizations of the Hamming bound and the Singleton bound in classical algebraic coding theory. In Part II, we prove the network generalization of the Gilbert-Varshamov bound and its …
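For reference, a short Python computation of the classical Gilbert-Varshamov bound that Part II generalizes to networks; this is the classical coding-theory bound only, and the function name is ours:

from math import comb

def gv_lower_bound(n: int, d: int, q: int = 2) -> int:
    """There exists a q-ary code of length n and minimum distance d
    with at least q^n / V_q(n, d-1) codewords (floor taken here)."""
    ball = sum(comb(n, i) * (q - 1) ** i for i in range(d))  # V_q(n, d-1)
    return q ** n // ball

print(gv_lower_bound(10, 3))  # at least 18 binary codewords of length 10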
Let $X_i$, $i = 1, \ldots, n$, be discrete random variables, and let $\tilde{X}_i$ be a set variable corresponding to $X_i$. Define the universal set to be $\Omega = \bigcup_{i} \tilde{X}_i$ and let $\mathcal{F}$ be the $\sigma$-field generated by $\{\tilde{X}_i, i = 1, \ldots, n\}$. It is shown that Shannon's information measures on the random variables $X_i$, $i = 1, \ldots, n$, constitute a unique measure $\mu^*$ on $\mathcal{F}$, which is called the …
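A numerical Python check of the set-measure correspondence for $n = 2$: by inclusion-exclusion, $\mu^*(\tilde{X}_1 \cap \tilde{X}_2) = H(X_1) + H(X_2) - H(X_1, X_2) = I(X_1; X_2)$. The joint distribution below is hypothetical:

from math import log2

# Hypothetical joint distribution of (X1, X2).
p = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    return -sum(q * log2(q) for q in dist.values() if q > 0)

def marginal(axis):
    m = {}
    for x, px in p.items():
        m[x[axis]] = m.get(x[axis], 0.0) + px
    return m

H1, H2, H12 = entropy(marginal(0)), entropy(marginal(1)), entropy(p)
mutual_info = H1 + H2 - H12   # mu*(X~1 ∩ X~2) = I(X1; X2) >= 0
print(round(mutual_info, 4))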