Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
Source: Wikipedia
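For a pair of discrete variables, the joint entropy is H(X, Y) = −Σ p(x, y) log₂ p(x, y), summed over all observed value pairs. As a minimal sketch (not from the source), it can be estimated from raw observations in Python; the function name `joint_entropy` and the example data are illustrative:

```python
from collections import Counter
from math import log2

def joint_entropy(pairs):
    """Estimate the joint entropy H(X, Y) in bits from (x, y) observations."""
    counts = Counter(pairs)          # empirical joint distribution
    n = len(pairs)
    # H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two independent fair bits: H(X, Y) = H(X) + H(Y) = 2 bits
pairs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(joint_entropy(pairs))  # → 2.0
```

When the two variables are perfectly correlated (e.g. only the pairs (0, 0) and (1, 1) occur with equal frequency), the joint entropy collapses to that of a single fair bit, 1.0 bit.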

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2013
We develop the hypothesis that an individual can gain some value from information, even if they do not use the information for their… 
2009
Distributed Denial of Service (DDoS) attacks have become one of the most serious threats to the Internet. A DDoS attack can be… 
Highly Cited
2007
Time delay estimation (TDE) is a basic technique for numerous applications where there is a need to localize and track a… 
2004
The JEDEC standard for board-level drop testing of handheld electronic products addresses test requirements in detail. However, the… 
Highly Cited
1995
For two-stage stochastic programs with integrality constraints in the second stage, we study continuity properties of the… 
1988
An allometric model is used to estimate the productivity of natural phytoplankton assemblages. The ¹⁴C fixation rates of > 5 µm…