
We consider two fundamental tasks in quantum information theory: data compression with quantum side information, and randomness extraction against quantum side information. We characterize these tasks for general sources using so-called one-shot entropies. These characterizations, in contrast to earlier results, enable us to derive tight second-order…

The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi…
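As a concrete illustration of the Rényi family described above, the following minimal sketch evaluates the classical Rényi entropy H_α(P) = (1/(1−α)) log₂ Σᵢ pᵢ^α for a few orders α; the distribution and orders are arbitrary choices for illustration, not taken from the paper.

```python
import math

def renyi_entropy(p, alpha):
    """Classical Renyi entropy H_alpha(P) = 1/(1-alpha) * log2(sum_i p_i^alpha).

    alpha = 1 is the Shannon limit, handled as a special case.
    """
    if alpha == 1:
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(renyi_entropy(p, 0))  # log2 of support size: 2.0
print(renyi_entropy(p, 1))  # Shannon entropy: 1.75
print(renyi_entropy(p, 2))  # collision entropy, smaller than the Shannon value
```

Note that H_α is non-increasing in α, one of the inherited properties the abstract alludes to.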


In classical and quantum information theory, operational quantities such as the amount of randomness that can be extracted from a given source or the amount of space needed to store given data are normally characterized by one of two entropy measures, called smooth min-entropy and smooth max-entropy, respectively. While both entropies are equal to the von…
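The two measures in the abstract above have simple classical special cases, sketched below with smoothing and quantum side information omitted: H_min(P) = −log₂ maxᵢ pᵢ (extractable randomness) and H_max(P) = 2 log₂ Σᵢ √pᵢ (one common definition, governing compression). The example distributions are illustrative assumptions.

```python
import math

def min_entropy(p):
    # H_min(P) = -log2(max_i p_i): number of nearly uniform bits extractable
    return -math.log2(max(p))

def max_entropy(p):
    # H_max(P) = 2 * log2(sum_i sqrt(p_i)), the Renyi entropy of order 1/2
    return 2 * math.log2(sum(math.sqrt(x) for x in p if x > 0))

def shannon(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

flat = [0.25] * 4          # uniform: all three entropies coincide (2 bits)
skew = [0.7, 0.1, 0.1, 0.1]  # biased: H_min < H < H_max
print(min_entropy(flat), shannon(flat), max_entropy(flat))
print(min_entropy(skew), shannon(skew), max_entropy(skew))
```

The uniform case mirrors the i.i.d. regime the abstract mentions, where both smooth entropies collapse to the von Neumann (here Shannon) entropy.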

The classical asymptotic equipartition property is the statement that, in the limit of a large number of identical repetitions of a random experiment, the output sequence is virtually certain to come from the typical set, each member of which is almost equally likely. In this paper, a fully quantum generalization of this property is shown, where both the…
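The classical statement above can be checked numerically: for an i.i.d. source, the empirical rate −(1/n) log₂ P(x^n) concentrates around the entropy H as n grows. A minimal sketch for a Bernoulli source (the bias 0.3 is an arbitrary illustrative choice):

```python
import math
import random

def empirical_log_prob_rate(seq, p1):
    # -(1/n) * log2 P(x^n) for an i.i.d. Bernoulli(p1) sequence
    n = len(seq)
    logp = sum(math.log2(p1) if x else math.log2(1 - p1) for x in seq)
    return -logp / n

random.seed(0)
p1 = 0.3
H = -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))  # about 0.881 bits
for n in (100, 10_000, 1_000_000):
    seq = [random.random() < p1 for _ in range(n)]
    # the printed rate approaches H as n grows (AEP)
    print(n, empirical_log_prob_rate(seq, p1))
```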

This paper shows that the logarithm of the ε-error capacity (average error probability) for n uses of a discrete memoryless channel (DMC) is upper bounded by the normal approximation plus a third-order term that does not exceed (1/2) log n + O(1) if the ε-dispersion of the channel is positive. This matches a lower bound by Y. Polyanskiy (2010)…

This paper shows that, under the average error probability formalism, the third-order term in the normal approximation for the additive white Gaussian noise channel with a maximal or equal power constraint is at least (1/2) log n + O(1). This improves on the lower bound by Polyanskiy-Poor-Verdú (2010) and matches the upper bound proved by the same…
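The normal approximation discussed in the two abstracts above can be evaluated for a concrete channel. The sketch below does this for a binary symmetric channel, using n·C − √(n·V)·Q⁻¹(ε) + (1/2)·log₂ n; the crossover probability, blocklength, and ε are illustrative assumptions, not values from the papers.

```python
import math
from statistics import NormalDist

def bsc_normal_approx(p, n, eps):
    """Normal approximation to log2 of the maximal code size for n uses of a
    BSC(p) at average error eps, including the (1/2) log n third-order term
    pinned down in the papers above."""
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))    # binary entropy
    C = 1 - h                                               # capacity, bits/use
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2           # channel dispersion
    qinv = NormalDist().inv_cdf(1 - eps)                    # Q^{-1}(eps)
    return n * C - math.sqrt(n * V) * qinv + 0.5 * math.log2(n)

# hypothetical numbers: 5% crossover, blocklength 1000, eps = 1e-3
print(bsc_normal_approx(0.05, 1000, 1e-3))
```

At these parameters the approximation sits noticeably below n·C, illustrating the finite-blocklength backoff that the dispersion term quantifies.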

Acknowledgements I was introduced to quantum information theory during my PhD studies in Renato Renner's group at ETH Zurich. It is from him that I learned most of what I know about quantum cryptography and smooth entropies. Renato also got me interested more generally in finite resource information theory as well as the entropies and other information…

Recently, a variety of new measures of quantum Rényi mutual information and quantum Rényi conditional entropy have been proposed, and some of their mathematical properties explored. Here, we show that the Rényi mutual information attains operational meaning in the context of composite hypothesis testing, when the null hypothesis is a fixed bipartite state…

- F Furrer, T Franz, M Berta, A Leverrier, V B Scholz, M Tomamichel +1 other
- Physical review letters
- 2012

We provide a security analysis for continuous variable quantum key distribution protocols based on the transmission of two-mode squeezed vacuum states measured via homodyne detection. We employ a version of the entropic uncertainty relation for smooth entropies to give a lower bound on the number of secret bits which can be extracted from a finite number of…