
Entropic vector

The entropic vector or entropic function is a concept arising in information theory. Shannon's information entropy measures and their associated… 
— Wikipedia
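The truncated definition above can be stated concretely. For $n$ jointly distributed discrete random variables $X_1,\dots,X_n$, the entropic vector collects the joint Shannon entropies of every nonempty index subset (standard notation, given here for orientation):

```latex
% Entropic vector: one coordinate per nonempty subset S of {1, ..., n}.
\mathbf{h} = \bigl( H(X_S) \bigr)_{\emptyset \neq S \subseteq \{1,\dots,n\}}
  \in \mathbb{R}^{2^n - 1},
\qquad
H(X_S) = -\sum_{x_S} p(x_S) \log_2 p(x_S).
```

A vector in $\mathbb{R}^{2^n-1}$ is called entropic if some joint distribution realizes it; the closure of the set of such vectors is the region $\overline{\Gamma}^{*}_N$ discussed in the papers below.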

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
“Bounds on information combining” are entropic inequalities that determine how the information (entropy) of a set of random… 
2016
In this work, we study the Yeung network coding entropic function outer bound and prove an equivalence relationship between its… 
2015
The region of entropic vectors $\overline{\Gamma}^{*}_N$ has been shown to be at the core of determining fundamental limits for… 
2013
This paper suggests that information geometry may form a natural framework to deal with the unknown part of the boundary of… 
2013
The concept of quantum phase space offers a view on quantum mechanics, which is different from the standard Hilbert space… 
2012
Corpus ID: 14192545
We show that two essentially conditional linear information inequalities (including the Zhang–Yeung’97 conditional inequality) do… 
2012
A computational technique for determining rate regions for networks and multilevel diversity coding systems based on inner and… 
2012
Given $n$ discrete random variables, their entropy vector is the $(2^n-1)$-dimensional vector obtained from the joint entropies of…
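The construction this snippet describes can be sketched in code. The helper below (a hypothetical `entropy_vector`, not from any of the cited papers) enumerates all $2^n-1$ nonempty subsets of the variables, marginalizes a joint pmf given as an $n$-dimensional NumPy array, and evaluates each joint Shannon entropy in bits:

```python
import itertools
import numpy as np

def entropy_vector(joint_pmf):
    """Return the (2**n - 1)-dimensional entropy vector of n discrete
    random variables, given their joint pmf as an n-dimensional array.

    The entry for a nonempty subset S is the joint entropy H(X_S),
    computed by summing the pmf over the axes not in S.
    """
    n = joint_pmf.ndim
    vec = {}
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            # Marginalize out every axis not in the subset.
            other = tuple(i for i in range(n) if i not in subset)
            marginal = joint_pmf.sum(axis=other)
            p = marginal[marginal > 0]          # drop zero-probability cells
            vec[subset] = float(-(p * np.log2(p)).sum())
    return vec

# Example: two independent fair bits.
pmf = np.full((2, 2), 0.25)
v = entropy_vector(pmf)
# v[(0,)] == v[(1,)] == 1 bit, and v[(0, 1)] == 2 bits.
```

Enumerating subsets this way scales as $O(2^n)$, which mirrors why characterizing the region of entropic vectors is tractable only for small $n$.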
2008
Given $n$ (discrete or continuous) random variables $X_i$, the $(2^n - 1)$-dimensional vector obtained by evaluating the joint entropy of…
2005
Complex dynamic systems are widespread in many industrial sectors, such as manufacturing systems, queuing systems, and supply…