Corpus ID: 60896462

Neural network design

@inproceedings{Hagan1995NeuralND,
  title={Neural network design},
  author={Martin T. Hagan and Howard B. Demuth and Mark H. Beale},
  year={1995}
}
This book, by the authors of the Neural Network Toolbox for MATLAB, provides clear and detailed coverage of fundamental neural network architectures and learning rules. In it, the authors emphasize a coherent presentation of the principal neural networks, methods for training them, and their applications to practical problems.

Features: extensive coverage of training methods for both feedforward networks (including multilayer and radial basis networks) and recurrent networks. In addition to…
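The book's central technique, backpropagation training of a multilayer feedforward network, can be sketched as a minimal NumPy toy (an illustrative example on the classic XOR problem, not code from the book or its MATLAB toolbox; the layer sizes and learning rate are arbitrary choices):

```python
import numpy as np

# Toy XOR dataset: the classic test for a multilayer network,
# since no single-layer perceptron can separate it.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)   # input -> hidden (3 units)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)   # hidden -> output

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out0 = forward(X)
mse_init = float(np.mean((out0 - y) ** 2))

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backpropagate the squared-error loss; sigmoid'(z) = s * (1 - s).
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    # Gradient-descent updates, layer by layer.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

_, out_final = forward(X)
mse_final = float(np.mean((out_final - y) ** 2))
print(f"MSE before: {mse_init:.3f}  after: {mse_final:.3f}")
```

This is batch gradient descent on the squared error; the book also develops faster variants (e.g. conjugate gradient and Levenberg-Marquardt) that this sketch does not attempt to reproduce.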
A New Formulation for Feedforward Neural Networks
Results show that ReNN can be trained more effectively and efficiently than common neural networks, and that the proposed regularization measure is an effective indicator of a network's generalization performance.
A Global Optimum Approach for One-Layer Neural Networks
The article presents a method for learning the weights of one-layer feed-forward neural networks by minimizing either the sum of squared errors or the maximum absolute error, measured in the input…
Fast Learning Neural Network Using Modified Corners Algorithm
This paper uses a different type of modeling to represent data, thereby addressing the problem of fast learning; it uses the distance separating the training data from an unknown input to compute the most probable output of the neural network.
Chapter 6 – Neural networks
Time series prediction and channel equalizer using artificial neural networks with VLSI implementation
  J. Mv • Computer Science • 2008
This paper describes the architecture and training procedure of a novel recurrent neural network, the multifeedback-layer neural network (MFLNN), which performed better than several networks available in the literature.
Constructing Multilayer Feedforward Neural Networks to Approximate Nonlinear Functions-Examples and Justifications
This study is focused on memoryless and monotonic functions that are widely encountered in engineering mechanics applications such as those seen in the stress-strain, moment-curvature, and load-displacement relationships, as well as time histories.
A Global Algorithm for Training Multilayer Neural Networks
A global algorithm for training multilayer neural networks is presented; it controls the local fields of neurons, induced by the input samples, through random adaptations of the synaptic weights.
Chapter 12: TRAINING RECURRENT NETWORKS FOR FILTERING AND CONTROL
This chapter introduces the Layered Digital Recurrent Network (LDRN), develops a general training algorithm for this network, and demonstrates the application of the LDRN to problems in control and signal processing.
Mapping some functions and four arithmetic operations to multilayer feedforward neural networks
Details and results are provided for mapping the four arithmetic operations, as well as other functions including the reciprocal, Gaussian, and Mexican hat functions, into multilayer feedforward neural networks with one hidden layer.