Entropy (information theory)
- Chapter 3 deals with probability distributions, discrete and continuous densities, distribution functions, bivariate…
- Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances which are characterized…
- We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon's original paper…
- In statistical physics, useful notions of entropy are defined with respect to some coarse-graining procedure over a…
- Adapted waveform analysis uses a library of orthonormal bases and an efficiency functional to match a basis to a given signal or…
- Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…
- A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known…
- A new nonprobabilistic entropy measure is introduced in the context of fuzzy sets or messages. Fuzzy units, or fits…
- Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to…
- 1. Entropy and mutual information 2. Discrete memoryless channels and their capacity-cost functions 3. Discrete memoryless…
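The common thread running through these excerpts is Shannon entropy, H(p) = -Σ p_i log p_i, which measures the uncertainty of a discrete distribution. A minimal sketch of the computation (the function name and example values are illustrative, not drawn from any of the listed works):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(p) = -sum(p * log(p)) in the given base.

    Zero-probability outcomes contribute nothing, using the
    standard convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain over two outcomes: 1 bit.
fair_coin = shannon_entropy([0.5, 0.5])

# A certain outcome carries no information: 0 bits.
certain = shannon_entropy([1.0])

print(fair_coin, certain)
```

With base 2 the result is measured in bits; using natural logarithms (base e) gives nats, the convention typical in the differential-entropy and statistical-physics papers above.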