On Variational Definition of Quantum Entropy

Abstract

The entropy of a distribution P can be defined in at least three different ways: 1) as the expectation of the Kullback-Leibler (KL) divergence of P from the elementary δ-measures (in this case it is interpreted as expected surprise); 2) as the negative KL divergence of some reference measure ν from the probability measure P; 3) as the supremum of Shannon's mutual information over all channels such that P is the output probability, in which case it is the dual of a transportation problem. In classical (i.e. commutative) probability, all three definitions yield the same quantity and provide only different interpretations of entropy. In non-commutative (i.e. quantum) probability, however, these definitions are not equivalent. In particular, the third definition, where the supremum is taken over all entanglements of two quantum systems with P as the output state, leads to a quantity that can be twice the von Neumann entropy. It was originally proposed by V. Belavkin and Ohya [1] and called the proper quantum entropy, because it allows one to define a quantum conditional entropy that is always non-negative. Here we extend these ideas to define quantum counterparts of proper cross-entropy and cross-information. We also show an inequality between the values of classical and quantum information.
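The classical equivalence claimed in the abstract can be checked numerically. The sketch below (illustrative only, not from the paper) computes the entropy of a discrete distribution P three ways: as expected surprise, as the negative KL divergence of P from the counting measure ν, and as the mutual information of the identity channel, which attains the supremum in the third definition for a commutative system.

```python
import numpy as np

# An arbitrary discrete distribution P (assumed strictly positive here).
P = np.array([0.5, 0.3, 0.2])

# Definition 1: expected surprise, H(P) = E_P[-log P(x)].
# This equals sum_x P(x) * D_KL(delta_x || P), since D_KL(delta_x || P) = -log P(x).
H1 = -np.sum(P * np.log(P))

# Definition 2: negative KL divergence of P from the reference (counting)
# measure nu, with nu(x) = 1 on each atom: H(P) = -sum_x P(x) log(P(x)/nu(x)).
nu = np.ones_like(P)
H2 = -np.sum(P * np.log(P / nu))

# Definition 3: sup of Shannon mutual information I(X;Y) over channels with
# output distribution P. Classically the sup is attained by the identity
# channel (Y = X), whose joint distribution is diag(P) and I(X;Y) = H(P).
joint = np.diag(P)
Px, Py = joint.sum(axis=1), joint.sum(axis=0)
mask = joint > 0                         # skip zero cells of the joint
H3 = np.sum(joint[mask] * np.log(joint[mask] / np.outer(Px, Py)[mask]))

print(H1, H2, H3)  # all three coincide in the commutative case
```

In the quantum case the third quantity differs: taking the supremum over entanglements of two quantum systems with output state P can give up to twice the von Neumann entropy, which is the point of departure for the proper quantum entropy.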

Cite this paper

@inproceedings{Belavkin2014OnVD, title={On Variational Definition of Quantum Entropy}, author={Roman V. Belavkin}, year={2014} }