Corpus ID: 60563397

Neural networks for pattern recognition

@book{Bishop1995NeuralNF,
  title={Neural networks for pattern recognition},
  author={Christopher M. Bishop},
  publisher={Oxford University Press},
  year={1995}
}
From the Publisher: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modelling probability density functions and the properties and merits of the multi-layer perceptron and radial basis function network models. Also covered are various forms of error functions, principal algorithms for error function minimization, and learning and generalization…

Citations

Neural Networks for Pattern Recognition, Image and Signal Processing
In this paper, Neural Networks are presented in the context of Statistical Pattern Recognition, focusing on all the steps needed to classify and interpolate input data. Standard…
Pattern Recognition and Neural Networks
Professor Ripley brings together two crucial ideas in pattern recognition, statistical methods and machine learning via neural networks, in this self-contained account.
Functional networks training algorithm for statistical pattern recognition
  • E. A. El-Sebakhy
  • Computer Science
  • Proceedings. ISCC 2004. Ninth International Symposium on Computers And Communications (IEEE Cat. No.04TH8769)
  • 2004
This work uses functional equations to approximate the neuron functions, which allows a wide class of functions to be represented; the steps of working with functional networks and of structural learning are also proposed.
Ranking Pattern Recognition Features for Neural Networks
This paper presents a practical technique for ranking features in terms of significance for a neural-net pattern recognizer, and provides the results of applying this clamping technique to a small selection of problems that demonstrate its practical worth.
A neural network model with bounded-weights for pattern classification
A new neural network model is proposed based on the concepts of multi-layer perceptrons, radial basis functions, and support vector machines; it does not require that kernel functions satisfy Mercer's condition, and it can be readily extended to multi-class classification.
Principal Feature Networks for Pattern Recognition
  • Qi Li
  • Computer Science
  • 2012
A different approach to neural network training and construction, developed by the author and Tufts and named the principal feature network (PFN), is introduced; the PFN is an analytical method for constructing a classifier or recognizer.
A closed-form neural network for discriminatory feature extraction from high-dimensional data
A new neural network for data discrimination in pattern recognition applications is theoretically shown to provide nonlinear transforms of the input data that are more general than those provided by other nonlinear multilayer perceptron and support-vector machine techniques for cases involving high-dimensional inputs.
Algorithmic Synthesis in Neural Network Training for Pattern Recognition
A large number of theoretical results establish the potential of Artificial Neural Networks (ANNs) as universal function approximators. The importance of ANNs in this context is that they offer a very…
Prefiltering for pattern recognition using wavelet transform and neural networks
This chapter focuses on pattern recognition using the wavelet transform and neural networks, together with principal component analysis, the Fourier transform, and other algorithms that allow selection of the best parameters.
Circular backpropagation networks for classification
The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely the surface-based and the prototype-based schemes, while retaining the advantage of being trainable by backpropagation.

References

Showing 1–10 of 258 references
Introduction to Statistical Pattern Recognition, Second Edition
This completely revised second edition presents an introduction to statistical pattern recognition, which is appropriate as a text for introductory courses in pattern recognition and as a reference book for workers in the field.
Probabilistic neural networks
A probabilistic neural network is formed that can compute nonlinear decision boundaries approaching the Bayes optimal; a four-layer neural network of the type proposed can map any input pattern to any number of classifications.
The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks
Simulations suggest that this method for building and training multilayer perceptrons composed of linear threshold units is efficient in terms of the number of units constructed, and that the networks it builds can generalize over patterns not in the training set.
Introduction to the theory of neural computation
This book is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feed-forward networks, and unsupervised learning.
Learning in Artificial Neural Networks: A Statistical Perspective
  • H. White
  • Computer Science
  • Neural Computation
  • 1989
The premise of this article is that learning procedures used to train artificial neural networks are inherently statistical techniques. It follows that statistical theory can provide considerable…
Bayesian learning for neural networks
Bayesian Learning for Neural Networks shows that Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional neural network learning methods.
Probabilistic Interpretation of Feedforward Classification Network Outputs, with Relationships to Statistical Pattern Recognition
  • J. Bridle
  • Computer Science
  • NATO Neurocomputing
  • 1989
Two modifications are explained: probability scoring, an alternative to squared-error minimisation, and a normalised exponential (softmax) multi-input generalisation of the logistic non-linearity of feed-forward non-linear networks with multiple outputs.
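The normalised exponential (softmax) mentioned in this entry can be sketched in a few lines. This is a generic illustration of the function as it is commonly defined today, not code from the cited paper; the max-subtraction step is a standard numerical-stability trick, an assumption beyond what the summary states:

```python
import numpy as np

def softmax(z):
    """Normalised exponential: maps real-valued scores to a
    probability distribution over classes."""
    z = np.asarray(z, dtype=float)
    shifted = z - z.max()  # subtract max so exp() cannot overflow
    e = np.exp(shifted)
    return e / e.sum()

p = softmax([2.0, 1.0, 0.1])
# p ≈ [0.659, 0.242, 0.099]; entries sum to 1 and preserve input order
```

With two classes, softmax reduces to the logistic non-linearity, which is the sense in which it generalises the logistic function to multiple outputs.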
Enhanced training algorithms, and integrated training/architecture selection for multilayer perceptron networks
  • M. Bello
  • Computer Science, Medicine
  • IEEE Trans. Neural Networks
  • 1992
Sophisticated nonlinear least-squares and quasi-Newton optimization techniques are used to construct enhanced multilayer perceptron training algorithms, which are compared to the backpropagation algorithm in the context of several example problems.
Neural Networks: A Review from a Statistical Perspective
This paper informs a statistical readership about Artificial Neural Networks (ANNs), points out some of the links with statistical methodology, and encourages cross-disciplinary research in the…
Learning algorithms and probability distributions in feed-forward and feed-back networks.
  • J. Hopfield
  • Computer Science, Medicine
  • Proceedings of the National Academy of Sciences of the United States of America
  • 1987
These learning algorithms are examined for a class of problems characterized by noisy or statistical data, in which the networks learn the relation between input data and probability distributions of answers, in simple but nontrivial networks.