
Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
Wikipedia
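
For reference, the standard definition (not stated on this page) for two discrete random variables X and Y with joint probability mass function p(x, y) is

    H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log_2 p(x, y)

which reduces to H(X) + H(Y) exactly when X and Y are independent. A minimal Python sketch of this computation follows; the function name joint_entropy and the example distribution are illustrative, not taken from this page.

import math

def joint_entropy(p_xy):
    # H(X, Y) in bits for a joint pmf given as {(x, y): probability}.
    # Zero-probability terms contribute nothing, by the convention
    # 0 log 0 = 0, so they are skipped.
    return -sum(p * math.log2(p) for p in p_xy.values() if p > 0)

# Example (illustrative): a fair coin X and an exact copy Y = X,
# so H(X, Y) = H(X) = 1 bit.
p = {("heads", "heads"): 0.5, ("tails", "tails"): 0.5}
print(joint_entropy(p))  # 1.0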

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2016
Physical Unclonable Functions (PUFs) are circuits that extract a device-dependent secret from inherently available…
2013
We develop the hypothesis that an individual can get some value of information, even if they do not use the information for their…
2011
The importance of the collection and analysis of data on discontinuities cannot be overemphasized. Problems which include… 
2011
A flexible-joint robot manipulator can be regarded as a cascade of two subsystems: the link dynamics and the motor dynamics. Using…
2002
We study the design of entropy-constrained multiterminal quantizers for coding two correlated continuous sources. Two design… 
Highly Cited
1995
For two-stage stochastic programs with integrality constraints in the second stage, we study continuity properties of the… 
1988
An allometric model is used to estimate the productivity of natural phytoplankton assemblages. The 14C fixation rates of >5 µm…
1987
A number of writers have supposed that for the full specification of belief, higher order probabilities are required. Some have…