Corpus ID: 54735240

Optimization in an Error Backpropagation Neural Network Environment with a Performance Test on a Pattern Classification Problem

@inproceedings{Fischer1998OptimizationIA,
  title={Optimization in an Error Backpropagation Neural Network Environment with a Performance Test on a Pattern Classification Problem},
  author={M. Fischer and Petra Staufer-Steinnocher},
  year={1998}
}
Various techniques for optimizing the multiple-class cross-entropy error function to train single-hidden-layer neural network classifiers with softmax output transfer functions are investigated on a real-world multispectral pixel-by-pixel classification problem that is of fundamental importance in remote sensing. These techniques include epoch-based and batch versions of backpropagation gradient descent, Polak-Ribière conjugate gradient, and BFGS quasi-Newton methods. The method of choice depends…
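The classifier described in the abstract can be illustrated with a minimal NumPy sketch: a single-hidden-layer network with softmax outputs, trained by batch gradient descent on the multiple-class cross-entropy error. This is not the authors' code; the tanh hidden activation, learning rate, and initialization are assumptions chosen for a small runnable example (the paper compares this baseline against conjugate-gradient and quasi-Newton optimizers, which are not sketched here).

```python
import numpy as np

def softmax(z):
    # Shift for numerical stability before exponentiating.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, Y, hidden=8, lr=0.5, epochs=500, seed=0):
    """Batch gradient descent on the multiple-class cross-entropy error
    for a single-hidden-layer network with softmax outputs.
    X: (n, d) inputs; Y: (n, k) one-hot targets."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = Y.shape[1]
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, k)); b2 = np.zeros(k)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)      # hidden activations (tanh assumed)
        P = softmax(H @ W2 + b2)      # class posteriors
        G = (P - Y) / n               # cross-entropy gradient w.r.t. logits
        dW2 = H.T @ G; db2 = G.sum(0)
        dH = (G @ W2.T) * (1 - H**2)  # backpropagate through tanh
        dW1 = X.T @ dH; db1 = dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def predict(X, params):
    W1, b1, W2, b2 = params
    return softmax(np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
```

In a pixel-by-pixel remote-sensing setting, each row of `X` would hold the multispectral band values of one pixel and each column of `Y` one land-cover class.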
2 Citations
Is inductive machine learning just another wild goose (or might it lay the golden egg)?
  • M. Gahegan
  • Computer Science
  • Int. J. Geogr. Inf. Sci.
  • 2003
The case for inductive and visual techniques in the analysis of spatial data
  • M. Gahegan
  • Geography, Computer Science
  • J. Geogr. Syst.
  • 2000

References

Showing 1-10 of 23 references
Comparison of optimized backpropagation algorithms
Optimization for training neural nets
  • E. Barnard
  • Mathematics, Computer Science
  • IEEE Trans. Neural Networks
  • 1992
Training neural nets through stochastic minimization
A scaled conjugate gradient algorithm for fast supervised learning
  • M. Møller
  • Mathematics, Computer Science
  • Neural Networks
  • 1993
Neural Networks for Pattern Recognition
Neural networks for optimization and signal processing
Increased rates of convergence through learning rate adaptation
First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method
Adaptive pattern recognition and neural networks