We introduce a new class of problems called network information flow, which is inspired by computer network applications. Consider a point-to-point communication network on which a number of information sources are to be multicast to certain sets of destinations. We assume that the information sources are mutually independent. The problem is to characterize …
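The standard illustration of this multicast problem is the butterfly network, where coding at the bottleneck node achieves a rate that pure routing cannot. A minimal sketch (the XOR scheme is the textbook illustration of network coding, not taken from the truncated abstract):

```python
# Butterfly network with two source bits and two sinks: each edge carries
# one bit per use. Routing can deliver both bits to only one sink at a time,
# but sending the XOR over the bottleneck edge serves both sinks at once.

def butterfly_multicast(b1: int, b2: int):
    """Multicast bits b1, b2 from the source to both sinks t1 and t2."""
    coded = b1 ^ b2                # the bottleneck edge carries b1 XOR b2
    # Sink t1 sees b1 directly plus the coded bit, and recovers b2 = b1 ^ coded.
    t1 = (b1, b1 ^ coded)
    # Sink t2 sees b2 directly plus the coded bit, and recovers b1 = b2 ^ coded.
    t2 = (b2 ^ coded, b2)
    return t1, t2

# Both sinks recover both source bits for every input pair.
for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly_multicast(b1, b2) == ((b1, b2), (b1, b2))
```

The point of the example is that the bottleneck node combines information rather than forwarding one stream, which is exactly the gain that characterizes network information flow over routing.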
In this paper we present a simple proof of the strong converse for identification via discrete memoryless quantum channels, based on a novel covering lemma. The new method is a generalization to quantum communication channels of Ahlswede's recently discovered approach to classical channels. It involves a development of explicit large-deviation …
The author determines for arbitrarily varying channels a) the average error capacity and b) the maximal error capacity in case of randomized encoding. A formula for the average error capacity in case of randomized encoding was announced several years ago by Dobrushin ([3]). Under a mild regularity condition this formula turns out to be valid and follows as …
We consider a sequence {Z_i}_{i≥1} of independent, identically distributed random variables, where each Z_i is a pair (X_i, Y_i). For any pair of events {X^n ∈ A}, {Y^n ∈ B} satisfying Pr(Y^n ∈ B | X^n ∈ A) > 1−ε and for any non-negative real c, we investigate how small Pr(Y^n ∈ B) can be in case Pr(X^n ∈ A) is larger than 2^{−nc}. We give the full answer to a generalized form of …
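The three quantities in this trade-off can be computed exactly for a small concrete instance. A hedged numerical sketch (the binary alphabet, the correlation p, and the threshold events A and B are assumptions chosen for illustration, not from the abstract):

```python
# Exact computation of Pr(X^n in A), Pr(Y^n in B | X^n in A) and Pr(Y^n in B)
# for i.i.d. pairs (X_i, Y_i): X_i uniform on {0,1}, Y_i = X_i flipped with
# probability p. A and B are Hamming-weight threshold events.
from itertools import product

n, p = 8, 0.1
A = lambda x: sum(x) >= 6   # event on X^n: at least 6 ones
B = lambda y: sum(y) >= 4   # event on Y^n: at least 4 ones

pA = pAB = pB = 0.0
for x in product((0, 1), repeat=n):
    px = 0.5 ** n
    for y in product((0, 1), repeat=n):
        flips = sum(xi != yi for xi, yi in zip(x, y))
        pxy = px * (p ** flips) * ((1 - p) ** (n - flips))
        pB += pxy * B(y)
        if A(x):
            pA += pxy
            pAB += pxy * B(y)

print(pA, pAB / pA, pB)   # Pr(A), Pr(B|A), Pr(B)
```

Here Pr(A) = 37/256 ≈ 0.14 (so Pr(A) > 2^{−nc} for a suitable c), while Pr(B | A) is close to 1; the question the abstract raises is how small Pr(B) itself can be made under such constraints.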
For the discrete memoryless channel (X, Y, W) we give characterisations of the zero-error erasure capacity C_er and the zero-error average list size capacity C_al in terms of limits of suitable information and divergence quantities, respectively (Theorem 1). However, they do not "single-letterize". Next we assume that X ⊆ Y and W(x|x) > 0 for all x ∈ X, and we …
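The operational idea behind C_er can be shown in one channel use. A hedged toy sketch (the noisy-typewriter channel and the two-word code are assumptions for illustration, not from the abstract): a zero-error erasure decoder announces a message only when the received output is consistent with exactly one codeword, and otherwise declares an erasure, so it may erase but never errs.

```python
# Noisy typewriter on {0,1,2,3}: support[x] is the set of outputs y with
# W(y|x) > 0. With codewords 0 and 1, output 0 identifies message 0, output 2
# identifies message 1, and output 1 is consistent with both codewords and
# must be erased -- the decoder is silent rather than wrong.

support = {0: {0, 1}, 1: {1, 2}, 2: {2, 3}, 3: {3, 0}}
code = [0, 1]   # two messages, transmitted as channel inputs 0 and 1

def decode(y):
    candidates = [m for m, x in enumerate(code) if y in support[x]]
    return candidates[0] if len(candidates) == 1 else "erasure"

print(decode(0), decode(2), decode(1))
```

The capacity C_er measures the best exponential rate of such codes over many channel uses when the erasure probability is also required to vanish.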