• Corpus ID: 60896462

Neural network design

@inproceedings{Hagan1995NeuralND,
  title={Neural network design},
  author={Martin T. Hagan and Howard B. Demuth and Mark H. Beale},
  year={1995}
}
This book, by the authors of the Neural Network Toolbox for MATLAB, provides clear and detailed coverage of fundamental neural network architectures and learning rules. The authors emphasize a coherent presentation of the principal neural networks, methods for training them, and their applications to practical problems. Features: extensive coverage of training methods for both feedforward networks (including multilayer and radial basis networks) and recurrent networks. In addition to… 
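The abstract mentions training methods for multilayer feedforward networks; as a minimal sketch of what such training looks like (a tiny two-sigmoid-layer network trained by plain gradient descent on XOR; the network size, learning rate, and dataset are our own illustration, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic test for a multilayer network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units, one sigmoid output.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared error through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
```

Radial basis and recurrent networks, also covered by the book, require different forward passes but follow the same gradient-based pattern.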

A New Formulation for Feedforward Neural Networks

Results show that ReNN can be trained more effectively and efficiently than common neural networks, and that the proposed regularization measure is an effective indicator of how well a network will generalize.

Fast Learning Neural Network Using Modified Corners Algorithm

This paper uses a different type of data modeling to achieve fast learning, using the distance between the training data and an unknown input to compute the most probable output of the neural network.
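The summary is terse; a minimal illustration of the general idea it describes (returning the output associated with the training example nearest to an unknown input, instead of iterative weight training) might look like the following sketch. The function name and data are ours, not the paper's:

```python
import numpy as np

def nearest_output(train_X, train_y, x):
    """Return the output of the training example closest to x
    (Euclidean distance) -- a distance-separation lookup rather
    than iterative weight training."""
    d = np.linalg.norm(train_X - x, axis=1)
    return train_y[np.argmin(d)]

train_X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
train_y = np.array([0, 1, 1])
print(nearest_output(train_X, train_y, np.array([0.9, 0.8])))  # → 1
```

Because no weights are fit, "training" is just storing the data, which is where the speed comes from.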

Chapter 6 – Neural networks

Constructing Multilayer Feedforward Neural Networks to Approximate Nonlinear Functions-Examples and Justifications

This study focuses on memoryless and monotonic functions widely encountered in engineering mechanics applications, such as stress-strain, moment-curvature, and load-displacement relationships, as well as time histories.

A Global Algorithm for Training Multilayer Neural Networks

A global algorithm for training multilayer neural networks is presented; it controls the local fields of neurons induced by input samples through random adaptations of the synaptic weights.

Chapter 12: TRAINING RECURRENT NETWORKS FOR FILTERING AND CONTROL

This chapter introduces the Layered Digital Recurrent Network (LDRN), develops a general training algorithm for this network, and demonstrates application of the LDRN to problems in controls and signal processing.

Mapping some functions and four arithmetic operations to multilayer feedforward neural networks

Details and results are provided for mapping the four arithmetic operations, as well as other functions including the reciprocal, Gaussian, and Mexican hat functions, into multilayer feedforward neural networks with one hidden layer.

A genetic approach to automatic neural network architecture optimization

This work introduces a novel strategy capable of generating a network topology that avoids overfitting in the majority of cases at affordable computational cost.

A fast constructive learning algorithm for single-hidden-layer neural networks

  • Q. Zhu, G. Huang, C. Siew
  • Computer Science
    ICARCV 2004 8th Control, Automation, Robotics and Vision Conference, 2004.
  • 2004
A novel fast learning algorithm called ELM (extreme learning machine) for single-hidden-layer feedforward networks (SLFNs) is proposed, in which a constructive method is used instead of a gradient-based learning algorithm; the performance of ELM is verified on two benchmark artificial problems.
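As a rough sketch of the ELM idea (hidden-layer weights are drawn at random and never trained; only the output weights are fit, in a single least-squares solve), on a toy regression problem of our own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression target: y = sin(x) on [0, pi].
X = np.linspace(0, np.pi, 50).reshape(-1, 1)
y = np.sin(X)

# ELM: hidden weights are random and never trained.
n_hidden = 20
W = rng.normal(0, 1, (1, n_hidden))
b = rng.normal(0, 1, n_hidden)
H = np.tanh(X @ W + b)          # random hidden-layer features

# Output weights come from a single least-squares solve,
# not gradient descent.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = H @ beta
```

Replacing iterative gradient descent with one linear solve is what makes the algorithm fast; the constructive variant in the paper additionally grows the hidden layer incrementally.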
...