Logical information theory: new logical foundations for information theory

@article{Ellerman2017LogicalIT,
  title={Logical information theory: new logical foundations for information theory},
  author={David Ellerman},
  journal={Logic Journal of the IGPL},
  year={2017},
  volume={25},
  pages={806--835}
}
There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the (product) probability…
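In Ellerman's framework, the logical entropy of a distribution is the probability that two independent draws yield distinct outcomes, h(p) = 1 − Σᵢ pᵢ². A minimal sketch contrasting it with Shannon entropy (function names are illustrative, not from the paper):

```python
import math

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum(p_i^2): the probability that two
    independent draws from p land in different blocks, i.e. make a
    'distinction'."""
    return 1.0 - sum(pi * pi for pi in p)

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), for comparison."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(shannon_entropy(p))   # 1.5 bits
```

Both quantities are maximized by the uniform distribution and vanish on a point mass; the paper's "uniform transformation" relates the two families of formulas term by term.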