Revisiting squared-error and cross-entropy functions for training neural network classifiers

@article{Kline2005RevisitingSA,
  title={Revisiting squared-error and cross-entropy functions for training neural network classifiers},
  author={Douglas Kline and Victor L. Berardi},
  journal={Neural Computing \& Applications},
  year={2005},
  volume={14},
  pages={310--318}
}
This paper investigates the efficacy of cross-entropy and squared-error objective functions for training feed-forward neural networks to estimate posterior probabilities. Previous research found no appreciable difference between neural network classifiers trained using cross-entropy or squared-error. The approach employed here, however, shows that cross-entropy has significant practical advantages over squared-error.
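One commonly cited practical advantage (a generic illustration, not an analysis taken from the paper itself) is the shape of the error signal at a sigmoid output unit: under cross-entropy the gradient with respect to the pre-activation stays proportional to the prediction error, while under squared-error it carries an extra sigmoid-derivative factor that vanishes when the unit saturates, slowing learning on badly misclassified examples. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-z))

def grad_wrt_logit(z, y):
    """Gradients of each loss w.r.t. the pre-activation z of a sigmoid output unit.

    For prediction p = sigmoid(z) and target y:
      cross-entropy:  dL/dz = p - y
      squared-error:  dL/dz = (p - y) * p * (1 - p)
    """
    p = sigmoid(z)
    grad_ce = p - y                  # error signal stays proportional to (p - y)
    grad_se = (p - y) * p * (1 - p)  # extra p(1-p) factor vanishes as the unit saturates
    return grad_ce, grad_se

# A badly misclassified example: target 1, strongly negative logit.
ce, se = grad_wrt_logit(-6.0, 1.0)
```

For this saturated unit the cross-entropy gradient is near -1 while the squared-error gradient is close to zero, so gradient descent corrects the mistake far faster under cross-entropy.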
Highly Cited
This paper has 71 citations.


Citations

Publications citing this paper.
Selected from 38 extracted citations:

Cost-sensitive multi-layer perceptron for binary classification with imbalanced data

2018 37th Chinese Control Conference (CCC) • 2018

Guiding Artificial Neural Networks Using Discriminatory Information In Hidden Layers

2017 IEEE International WIE Conference on Electrical and Computer Engineering (WIECON-ECE) • 2017

Convergence Analysis of Two Loss Functions in Soft-Max Regression

IEEE Transactions on Signal Processing • 2016

Detecting Sex From Handwritten Examples

2018 IEEE International Conference on System, Computation, Automation and Networking (ICSCA) • 2018

Detection of rail surface defects based on CNN image recognition and classification

2018 20th International Conference on Advanced Communication Technology (ICACT) • 2018

[Figure: Citations per Year, 2010–2018. Semantic Scholar estimates that this publication has 71 citations based on the available data.]

