
Mutual information

Known as: Average Mutual Information, Transinformation, Mutual entropy 
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables: it quantifies the amount of information obtained about one variable by observing the other.
Source: Wikipedia
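For reference, the usual definition for two discrete random variables X and Y with joint distribution p_{X,Y} and marginals p_X, p_Y (a standard formulation, not drawn from the papers below; the continuous case replaces the sums with integrals) is:

```latex
I(X;Y) \;=\; \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}}
  p_{X,Y}(x,y)\,\log \frac{p_{X,Y}(x,y)}{p_X(x)\,p_Y(y)}
  \;=\; H(X) - H(X \mid Y)
```

So I(X;Y) is zero exactly when X and Y are independent, and it grows as observing one variable reduces the uncertainty (entropy) about the other.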

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2009
Spam is one of the main problems in email communications. As the volume of non-English-language spam increases, little work is…
2008
The Gaussian wiretap channel model is studied when there are multiple antennas at the sender, the receiver, and the eavesdropper…
Highly Cited
2007
Next generation cellular radio systems will exceed the limitations of UMTS. The convergence of data and voice traffic will be… 
Highly Cited
2006
In recent years, a number of lexical association measures have been studied to help extract new scientific terminology or…
Highly Cited
2006
Image enhancement of low-resolution images can be done through methods such as interpolation, super-resolution using multiple… 
Highly Cited
2006
We propose methods to estimate the secrecy rate of fuzzy sources (e.g., biometrics and physical unclonable functions (PUFs)) using…
Highly Cited
1997
A nonlinear system is considered in which an aperiodic binary input signal is added to arbitrarily distributed noise and compared…
Highly Cited
1997
Independent component analysis (ICA) is a statistical method for transforming an observed multidimensional random vector into… 
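As an illustrative sketch only (assuming the usual linear-mixing setting; the signals, the mixing matrix, and the use of scikit-learn's FastICA are choices made for this example, not the method of the paper above), the transformation ICA performs can be demonstrated as follows:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical example: two independent sources, mixed linearly into an
# observed multidimensional random vector, then separated with FastICA.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(3 * t)                       # smooth periodic source
s2 = np.sign(np.sin(7 * t))              # square-wave source
S = np.c_[s1, s2]                        # (2000, 2) source matrix

A = np.array([[1.0, 0.5],                # made-up mixing matrix
              [0.4, 1.0]])
X = S @ A.T + 0.02 * rng.standard_normal(S.shape)   # noisy observations

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)             # estimated independent components
print(S_est.shape)                       # -> (2000, 2)
```

The recovered components match the original sources only up to permutation and scaling, which is the inherent ambiguity of ICA.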
Highly Cited
1995
Current virtual reality (VR) systems present powerful graphical information with human interaction capability but may be missing…