
Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.
Source: Wikipedia
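For reference, the standard Shannon definition for two discrete random variables X and Y with joint probability mass function P(x, y) is

    H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x, y) \log_2 P(x, y)

with the convention that 0 log 0 = 0. The definition extends to any finite set of variables by summing over the full joint distribution.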

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2013
We develop the hypothesis that an individual can get some value of information, even if they do not use the information for their… 
2009
Distributed Denial of Service (DDoS) attacks have become one of the most serious threats to the Internet. A DDoS attack can be… 
2004
The JEDEC standard for board-level drop testing of handheld electronic products addresses test requirements in detail. However, the… 
2004
In this paper, the implications of information relations in the case of multispectral images are analyzed. Higher-order… 
Highly Cited
1995
For two-stage stochastic programs with integrality constraints in the second stage, we study continuity properties of the… 
1988
An allometric model is used to estimate the productivity of natural phytoplankton assemblages. The ¹⁴C fixation rates of > 5 μm… 
1987
A number of writers have supposed that for the full specification of belief, higher-order probabilities are required. Some have…