We consider two fundamental tasks in quantum information theory: data compression with quantum side information and randomness extraction against quantum side information. We characterize these tasks for general sources using so-called one-shot entropies. These characterizations — in contrast to earlier results — enable us to derive tight second-order …
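As a point of reference (the expansion itself is not spelled out in the truncated abstract), second-order results of this kind typically take the following form for n i.i.d. uses of the source, where h denotes the relevant von Neumann (conditional) entropy rate, v the corresponding information variance, Φ the standard normal CDF, and ε the allowed error:
\[
(\text{one-shot quantity for } n \text{ copies}) \;=\; n\,h + \sqrt{n\,v}\;\Phi^{-1}(\varepsilon) + O(\log n).
\]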
The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies …
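For reference (not part of the abstract), the classical Rényi entropy of order \alpha \in (0,1)\cup(1,\infty) of a random variable X with distribution P_X is
\[
H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_x P_X(x)^\alpha,
\]
and it recovers the Shannon entropy H(X) in the limit \alpha \to 1.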
In classical and quantum information theory, operational quantities such as the amount of randomness that can be extracted from a given source or the amount of space needed to store given data are normally characterized by one of two entropy measures, called smooth min-entropy and smooth max-entropy, respectively. While both entropies are equal to the von Neumann entropy …
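As a sketch of the definitions alluded to here (notation assumed, not quoted from the abstract): the conditional min-entropy of A given B for a bipartite state \rho_{AB} can be written as
\[
H_{\min}(A|B)_\rho = \max_{\sigma_B} \sup\{\lambda \in \mathbb{R} : \rho_{AB} \le 2^{-\lambda}\, \mathrm{id}_A \otimes \sigma_B\},
\]
where the maximum is over density operators \sigma_B. The max-entropy is its dual, H_{\max}(A|B)_\rho = -H_{\min}(A|C)_\rho for any purification \rho_{ABC}, and the smooth versions H_{\min}^\varepsilon and H_{\max}^\varepsilon optimize these quantities over states \varepsilon-close to \rho_{AB}.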
The Leftover Hash Lemma states that the output of a two-universal hash function applied to an input with sufficiently high entropy is almost uniformly random. In its standard formulation, the lemma refers to a notion of randomness that is (usually implicitly) defined with respect to classical side information. Here, we prove a (strictly) more general …
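As an illustration of the two-universal hashing the lemma refers to (this construction and the NumPy code are illustrative choices, not taken from the paper), a uniformly random binary Toeplitz matrix applied over GF(2) is a standard two-universal family used for privacy amplification:

    import numpy as np

    def toeplitz_hash(x_bits, seed_bits, out_len):
        """Two-universal hash via a random binary Toeplitz matrix over GF(2).

        x_bits    : length-n array of 0/1 input bits (the weakly random source)
        seed_bits : n + out_len - 1 uniformly random 0/1 bits defining the matrix
        out_len   : number of output bits
        """
        n = len(x_bits)
        assert len(seed_bits) == n + out_len - 1
        # Row i of the Toeplitz matrix is seed_bits[i : i + n] reversed, so that
        # entry (i, j) depends only on i - j, as required for a Toeplitz matrix.
        rows = np.stack([seed_bits[i:i + n][::-1] for i in range(out_len)])
        return rows.dot(x_bits) % 2  # matrix-vector product modulo 2

    # Example: hash a 1024-bit input down to 128 bits using a fresh random seed.
    rng = np.random.default_rng()
    x = rng.integers(0, 2, size=1024)
    seed = rng.integers(0, 2, size=1024 + 128 - 1)
    key = toeplitz_hash(x, seed, 128)

The lemma then guarantees, roughly, that extracting \ell bits from an input with smooth min-entropy H_{\min}^\varepsilon(X|E) leaves the output within trace distance about 2^{-(H_{\min}^\varepsilon(X|E)-\ell)/2} of uniform, even relative to quantum side information E (the precise constants vary between formulations).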
This paper shows that the logarithm of the ε-error capacity (average error probability) for n uses of a discrete memoryless channel (DMC) is upper bounded by the normal approximation plus a third-order term that does not exceed (1/2) log n + O(1) if the ε-dispersion of the channel is positive. This matches a lower bound by Y. Polyanskiy (2010) for DMCs with …
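In standard notation (assumed here, not spelled out in the abstract), with C the capacity, V_\varepsilon the \varepsilon-dispersion, \Phi the standard normal CDF and M^*(n,\varepsilon) the maximal code size, the claimed upper bound reads
\[
\log M^*(n,\varepsilon) \le nC + \sqrt{n V_\varepsilon}\;\Phi^{-1}(\varepsilon) + \tfrac{1}{2}\log n + O(1).
\]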
We provide a security analysis for continuous variable quantum key distribution protocols based on the transmission of two-mode squeezed vacuum states measured via homodyne detection. We employ a version of the entropic uncertainty relation for smooth entropies to give a lower bound on the number of secret bits which can be extracted from a finite number of …
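The uncertainty relation invoked here has the generic form (smooth-entropy notation assumed; the exact overlap term for binned homodyne outcomes is given in the paper and only indicated schematically)
\[
H_{\min}^{\varepsilon}(X|E) + H_{\max}^{\varepsilon}(P|B) \ge n \log\frac{1}{c},
\]
where X and P are the discretized quadrature outcomes of n homodyne measurements, E is the eavesdropper's system, B the second party's system, and c the maximal overlap of the two measurements, which for binned quadratures depends on the bin widths.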
Recently, a variety of new measures of quantum Rényi mutual information and quantum Rényi conditional entropy have been proposed, and some of their mathematical properties explored. Here, we show that the Rényi mutual information attains operational meaning in the context of composite hypothesis testing, when the null hypothesis is a fixed bipartite state …
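One commonly used convention for these quantities (assumed here for concreteness) defines the Rényi mutual information through the sandwiched Rényi relative entropy,
\[
\widetilde{D}_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,\log \mathrm{Tr}\Big[\big(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\big)^{\alpha}\Big],
\qquad
I_\alpha(A:B)_\rho = \min_{\sigma_B} \widetilde{D}_\alpha\big(\rho_{AB}\,\big\|\,\rho_A\otimes\sigma_B\big),
\]
where the minimization is over density operators \sigma_B.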
The quantum capacity of a memoryless channel determines the maximal rate at which we can communicate reliably over asymptotically many uses of the channel. Here we illustrate that this asymptotic characterization is insufficient in practical scenarios where decoherence severely limits our ability to manipulate large quantum systems in the encoder and decoder …
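The asymptotic characterization referred to here is the regularized coherent information (standard notation assumed, not quoted from the abstract):
\[
Q(\mathcal{N}) = \lim_{n\to\infty} \frac{1}{n} \max_{\rho} I_c\big(\rho, \mathcal{N}^{\otimes n}\big),
\qquad
I_c(\rho,\mathcal{M}) = H\big(\mathcal{M}(\rho)\big) - H\big((\mathcal{M}\otimes\mathrm{id})(\psi_\rho)\big),
\]
where \psi_\rho is a purification of \rho.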