Corpus ID: 60563397

Neural networks for pattern recognition

@book{Bishop1995NeuralNF,
  title={Neural networks for pattern recognition},
  author={Christopher M. Bishop},
  publisher={Oxford University Press},
  year={1995}
}
From the Publisher: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modelling probability density functions and the properties and merits of the multi-layer perceptron and radial basis function network models. Also covered are various forms of error functions, principal algorithms for error function minimization, learning and generalization… 
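The error-minimization procedures the abstract alludes to can be illustrated with a minimal sketch (illustrative code, not taken from the book): stochastic gradient descent on a sum-of-squares error for a single logistic unit, here learning logical AND.

```python
import math
import random

# Toy sketch: minimize E = (y - t)^2 / 2 for one logistic unit by
# stochastic gradient descent, the simplest instance of the
# error-function-minimization algorithms surveyed in the book.
def train(data, lr=0.5, epochs=2000, seed=0):
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(3)]  # w1, w2, bias
    for _ in range(epochs):
        for x1, x2, t in data:
            a = w[0] * x1 + w[1] * x2 + w[2]
            y = 1.0 / (1.0 + math.exp(-a))      # logistic activation
            delta = (y - t) * y * (1.0 - y)     # dE/da for E = (y - t)^2 / 2
            w = [w[0] - lr * delta * x1,
                 w[1] - lr * delta * x2,
                 w[2] - lr * delta]
    return w

def predict(w, x1, x2):
    return 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + w[2])))

# Logical AND: a small linearly separable two-class problem
data = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w = train(data)
```

The same gradient-descent scheme extends, via backpropagation, to the multi-layer perceptron and radial basis function models the book treats in depth.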

Citations

Neural Networks for Pattern Recognition, Image and Signal Processing
In this paper, Neural Networks are presented in the context of Statistical Pattern Recognition, focusing on all the steps needed to classify and interpolate input data…
Pattern Recognition and Neural Networks
Professor Ripley brings together two crucial ideas in pattern recognition: statistical methods and machine learning via neural networks, in this self-contained account.
Functional networks training algorithm for statistical pattern recognition
  • E. A. El-Sebakhy
  • Computer Science
    Proceedings. ISCC 2004. Ninth International Symposium on Computers And Communications (IEEE Cat. No.04TH8769)
  • 2004
This work uses functional equations to approximate the neuron functions, which allows a wide class of functions to be represented, and proposes the steps of working with functional networks and their structural learning.
Ranking Pattern Recognition Features for Neural Networks
This paper presents a practical technique for ranking features in terms of significance for a neural-net pattern recognizer, and provides the results of applying this clamping technique to a small selection of problems that demonstrate its practical worth.
Principal Feature Networks for Pattern Recognition
  • Qi Li
  • Computer Science
  • 2012
The principal feature network (PFN), developed by the author and Tufts, is introduced: an analytical method for constructing and training a neural-network classifier or recognizer.
Algorithmic Synthesis in Neural Network Training for Pattern Recognition
A large number of theoretical results establish the potential of Artificial Neural Networks (ANNs) as universal function approximators. ANNs' importance in this context is that they offer a very…
Circular backpropagation networks for classification
The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely, the surface-based and the prototype-based schemes, while retaining the advantage of being trainable by backpropagation.
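The unification described above can be sketched concretely (a hedged illustration of the underlying idea, not code from the paper): appending one extra input equal to ||x||² lets a plane-based, perceptron-style unit realize spherical decision surfaces as well.

```python
import random

# Appending ||x||^2 as an extra input turns a circular boundary in the
# plane into a linear boundary in the augmented space, so a classic
# perceptron can learn it.
def augment(x):
    return list(x) + [sum(xi * xi for xi in x)]  # extra ||x||^2 input

def perceptron(samples, labels, lr=0.1, epochs=200):
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != t:  # classic perceptron update on mistakes only
                w = [wi + lr * (t - y) * xi for wi, xi in zip(w, x)]
                b += lr * (t - y)
    return w, b

# "Inside the unit circle" vs "outside" is not linearly separable in
# 2-D, but becomes separable once the quadratic input is appended.
rng = random.Random(0)
pts = []
while len(pts) < 200:
    p = (rng.uniform(-2, 2), rng.uniform(-2, 2))
    if abs(p[0] ** 2 + p[1] ** 2 - 1.0) > 0.2:  # keep a safety margin
        pts.append(p)
labels = [1 if x * x + y * y < 1.0 else 0 for x, y in pts]
w, b = perceptron([augment(p) for p in pts], labels)
```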

References

Showing 1–10 of 251 references
Introduction to Statistical Pattern Recognition, Second Edition
This completely revised second edition presents an introduction to statistical pattern recognition, which is appropriate as a text for introductory courses in pattern recognition and as a reference book for workers in the field.
Probabilistic neural networks
Introduction to the theory of neural computation
This book is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feed-forward networks, and unsupervised learning.
The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks
Simulations suggest that this method for building and training multilayer perceptrons composed of linear threshold units is efficient in terms of the numbers of units constructed, and the networks it builds can generalize over patterns not in the training set.
Learning in Artificial Neural Networks: A Statistical Perspective
  • H. White
  • Computer Science
    Neural Computation
  • 1989
Concepts and analytical results from the literatures of mathematical statistics, econometrics, systems identification, and optimization theory relevant to the analysis of learning in artificial neural networks are reviewed.
Neural Networks: A Review from a Statistical Perspective
This paper informs a statistical readership about Artificial Neural Networks (ANNs), points out some of the links with statistical methodology and encourages cross-disciplinary research in the…
Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks
  • M. Bello
  • Computer Science
    IEEE Trans. Neural Networks
  • 1992
Sophisticated nonlinear least-squares and quasi-Newton optimization techniques are used to construct enhanced multilayer perceptron training algorithms, which are compared to the backpropagation algorithm in the context of several example problems.
30 years of adaptive neural networks: perceptron, Madaline, and backpropagation
The history, origination, operating characteristics, and basic theory of several supervised neural-network training algorithms (including the perceptron rule, the least-mean-square algorithm, three Madaline rules, and the backpropagation technique) are described.
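The least-mean-square rule mentioned above admits a very short sketch (illustrative code, not from the review): a linear unit's weights are moved down the instantaneous squared-error gradient after every presented pattern.

```python
# LMS (Widrow-Hoff) rule: after each pattern, nudge the weights along
# the negative gradient of the instantaneous squared error.
def lms(patterns, targets, lr=0.1, epochs=500):
    w = [0.0] * len(patterns[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            y = sum(wi * xi for wi, xi in zip(w, x)) + b  # linear output
            err = t - y
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Recover y = 2*x1 - x2 + 1 from a handful of noiseless samples
X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
T = [2 * x1 - x2 + 1 for x1, x2 in X]
w, b = lms(X, T)
```

On consistent linear data the cyclic LMS iteration contracts toward the exact solution, which is why the recovered weights match the generating coefficients here.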
Learning algorithms and probability distributions in feed-forward and feed-back networks.
  • J. Hopfield
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 1987
These learning algorithms are examined for a class of problems characterized by noisy or statistical data, in which the networks learn the relation between input data and probability distributions of answers, in simple but nontrivial networks.
Neural Networks, Principal Components, and Subspaces
  • E. Oja
  • Computer Science
    Int. J. Neural Syst.
  • 1989
A single neuron with Hebbian-type learning for the connection weights, and with nonlinear internal feedback, has been shown to extract the statistical principal components of its stationary input pattern sequence, which yields a multi-dimensional, principal component subspace.
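The single-neuron rule summarized above (commonly known as Oja's rule) is compact enough to sketch directly; the data-generation choices below are illustrative, not from the paper.

```python
import math
import random

# Oja's rule: Hebbian growth y*x_i with a feedback term y^2*w_i that
# keeps the weight vector bounded; w converges to the first principal
# component (unit norm) of a zero-mean input sequence.
def oja(samples, lr=0.01, epochs=50, seed=1):
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(len(samples[0]))]
    for _ in range(epochs):
        for x in samples:
            y = sum(wi * xi for wi, xi in zip(w, x))  # neuron output
            w = [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]
    return w

# Zero-mean 2-D data with most variance along the (1, 1) direction
rng = random.Random(0)
data = []
for _ in range(200):
    s = rng.gauss(0, 1.0)   # large-variance component along (1, 1)
    n = rng.gauss(0, 0.2)   # small-variance component along (1, -1)
    data.append(((s + n) / math.sqrt(2), (s - n) / math.sqrt(2)))

w = oja(data)  # settles near the unit vector +/-(1, 1)/sqrt(2)
```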