
Cross entropy

Known as: Cross-entropy, Log loss, Minxent 
In information theory, the cross entropy between two probability distributions p and q over the same underlying set of events measures the average number… 
Wikipedia
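
To make the definition above concrete, here is a minimal sketch of the cross-entropy computation for two discrete distributions. The example distributions p and q and the choice of natural logarithms below are illustrative assumptions, not taken from this page.

import math

def cross_entropy(p, q):
    # H(p, q) = -sum_x p(x) * log q(x) for discrete distributions over the same events.
    # Natural logarithm gives the result in nats; use math.log(qi, 2) for bits.
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# Example: true distribution p, estimated distribution q over three events.
p = [0.5, 0.25, 0.25]
q = [0.4, 0.3, 0.3]

print(cross_entropy(p, q))  # average code length (nats) when coding p with a code optimized for q
print(cross_entropy(p, p))  # equals the entropy H(p); cross entropy is minimized when q = p

As the "Log loss" alias suggests, averaging this quantity over labeled examples, with q taken as a model's predicted class probabilities, gives the familiar classification loss.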

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2019
Performance evaluation of urban autonomous vehicles (AVs) requires a realistic model of the behavior of other road users in the… 
2019
Histology is vital in the diagnosis and prognosis of cancers and many other diseases. For the analysis of… 
2013
This paper presents a novel semantic-based phrase translation model. A pair of source and target phrases is projected into… 
2011
By considering the temporal and spatial correlation of moving-vehicle images, the method combining three-frame… 
2007
A fuzzy error matrix can be used to summarize accuracy assessment information when both the map and reference data are labeled… 
Review
2006
The EM algorithm is a very powerful optimization method and has gained popularity in many fields. Unfortunately, EM is only a… 
Highly Cited
1993
  • M. Moher
  • 1993
  • Corpus ID: 122648042
An intuitive algorithm by Lodge et al. [1992] for iterative decoding of block codes is shown to follow from entropy optimization… 
1979
The principle of maximum entropy and a generalization, the principle of minimum cross entropy, are prescriptions for solving…
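
The snippet above refers to the principle of minimum cross entropy (minxent). As a rough sketch of how it is usually stated, and with the moment constraints f_k below chosen purely for illustration, the prescription is to pick, among all distributions satisfying the constraints, the one with the smallest cross entropy (relative entropy) to a given prior q:

% Minimum cross-entropy: choose the distribution p closest to the prior q
% subject to the known constraints; the constraints f_k here are illustrative.
\[
  p^{*} = \arg\min_{p}\ \sum_{x} p(x)\,\log\frac{p(x)}{q(x)}
  \quad \text{subject to} \quad
  \sum_{x} p(x) = 1, \qquad
  \sum_{x} p(x)\, f_k(x) = F_k,\ \ k = 1,\dots,m.
\]
% With a uniform prior q, this reduces to the principle of maximum entropy.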