Corpus ID: 60896462

Neural network design

@inproceedings{Hagan1995NeuralND,
  title={Neural network design},
  author={Martin T. Hagan and Howard B. Demuth and Mark H. Beale},
  year={1995}
}
This book, by the authors of the Neural Network Toolbox for MATLAB, provides clear and detailed coverage of fundamental neural network architectures and learning rules. In it, the authors emphasize a coherent presentation of the principal neural networks, methods for training them, and their applications to practical problems. Features: extensive coverage of training methods for both feedforward networks (including multilayer and radial basis networks) and recurrent networks. In addition to…
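As a rough illustration of the kind of material the book covers (this is not code from the book or from the MATLAB toolbox), the sketch below trains a small two-layer feedforward network, tanh hidden layer plus linear output, by batch gradient descent on a simple 1-D curve-fitting task. The hidden-layer size, learning rate, and target function are arbitrary choices made for the example.

```python
# Illustrative sketch only: a two-layer feedforward network (tanh hidden
# layer, linear output) trained by batch gradient descent on a 1-D target.
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-2, 2, 200).reshape(-1, 1)      # inputs, shape (N, 1)
T = np.sin(np.pi * X)                           # targets to approximate (assumed example)

H = 10                                          # hidden units (arbitrary)
W1 = rng.normal(0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(5000):
    A = np.tanh(X @ W1 + b1)                    # hidden activations
    Y = A @ W2 + b2                             # network output
    E = Y - T                                   # output error
    # Backpropagate the mean-squared-error gradient through both layers.
    dW2 = A.T @ E / len(X); db2 = E.mean(0)
    dA = (E @ W2.T) * (1 - A**2)                # tanh derivative
    dW1 = X.T @ dA / len(X); db1 = dA.mean(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float((E**2).mean()))
```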
A New Formulation for Feedforward Neural Networks
TLDR: Results show that ReNN can be trained more effectively and efficiently than common neural networks, and that the proposed regularization measure is an effective indicator of how well a network will generalize.
A Global Optimum Approach for One-Layer Neural Networks
The article presents a method for learning the weights in one-layer feed-forward neural networks by minimizing either the sum of squared errors or the maximum absolute error, measured in the input…
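As a minimal sketch of the sum-of-squared-errors case only (not the article's algorithm, and ignoring the maximum-absolute-error variant), a one-layer network with a linear output has a closed-form least-squares solution for its weights. The data below are synthetic.

```python
# Minimal sketch: sum-of-squared-errors weights for a one-layer linear
# network obtained by a least-squares solve on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                   # 100 samples, 3 inputs (made up)
T = X @ np.array([[1.5], [-2.0], [0.5]]) + 0.1 * rng.normal(size=(100, 1))

Xb = np.hstack([X, np.ones((100, 1))])          # append a bias column
W, *_ = np.linalg.lstsq(Xb, T, rcond=None)      # minimizes ||Xb W - T||^2
print("learned weights (incl. bias):", W.ravel())
```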
Fast Learning Neural Network Using Modified Corners Algorithm
TLDR: This paper uses a different type of modeling to represent data and thereby addresses the problem of fast learning; it uses the distance between the training data and an unknown input to determine the most probable output of the neural network.
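Purely to illustrate the distance-based idea mentioned above (this is not the modified Corners algorithm itself), the sketch below stores a few training points and returns the output of the nearest one; the data and labels are made up.

```python
# Hedged illustration: pick the most probable output by distance to the
# stored training samples, nearest-neighbour style.
import numpy as np

train_X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
train_y = np.array([0, 0, 1, 1])                # stored outputs (made up)

def predict(x):
    d = np.linalg.norm(train_X - x, axis=1)     # distance to each stored sample
    return train_y[np.argmin(d)]                # output of the closest one

print(predict(np.array([0.9, 0.2])))            # -> 1
```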
Chapter 6 – Neural networks
TLDR: This chapter introduces the fundamental ideas of neural networks: a model of biological neural networks built around nodes (or processing units) that simulate the action of a neuron, with a capacity that depends on the number of neurons used.
Time series prediction and channel equalizer using artificial neural networks with VLSI implementation
  • J. Mv
  • Computer Science
  • 2008
TLDR: The architecture and training procedure of a novel recurrent neural network, referred to as the multifeedback-layer neural network (MFLNN), are described in this paper; it performed better than several networks available in the literature.
Constructing Multilayer Feedforward Neural Networks to Approximate Nonlinear Functions-Examples and Justifications
TLDR: This study focuses on memoryless and monotonic functions that are widely encountered in engineering mechanics applications, such as stress-strain, moment-curvature, and load-displacement relationships, as well as time histories.
A Global Algorithm for Training Multilayer Neural Networks
TLDR: A global algorithm for training multilayer neural networks is presented; it focuses on controlling the local fields of neurons induced by the input samples through random adaptations of the synaptic weights.
A review of Hopfield neural networks for solving mathematical programming problems
TLDR: The Hopfield neural network (HNN) is one of the major neural networks for solving optimization or mathematical programming (MP) problems; it utilizes three common methods, penalty functions, Lagrange multipliers, and primal-dual methods, to construct an energy function.
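As a rough sketch of the penalty-function route only (not the review's formulation, and without actual Hopfield dynamics), the example below folds an equality constraint into an energy E(v) = f(v) + (c/2) g(v)^2 and simply follows the negative gradient numerically. The objective, constraint, and penalty weight are invented for illustration.

```python
# Penalty-function idea in miniature: turn min f(v) s.t. g(v)=0 into an
# unconstrained energy E(v) = f(v) + (c/2)*g(v)^2 and descend on it.
import numpy as np

def f(v): return (v[0] - 2)**2 + (v[1] - 1)**2      # example objective
def g(v): return v[0] + v[1] - 1                    # example equality constraint

c = 50.0                                            # penalty weight (arbitrary)
def E(v): return f(v) + 0.5 * c * g(v)**2           # penalty-based energy

v, lr = np.zeros(2), 0.01
for _ in range(2000):
    # follow -grad E via central finite differences (stand-in for the
    # network's continuous-time descent of the energy)
    grad = np.array([(E(v + d) - E(v - d)) / 2e-5
                     for d in (np.array([1e-5, 0.0]), np.array([0.0, 1e-5]))])
    v -= lr * grad

print("approximate constrained minimum:", v)        # close to (1, 0)
```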
Chapter 12: TRAINING RECURRENT NETWORKS FOR FILTERING AND CONTROL
Neural networks can be classified into recurrent and nonrecurrent categories. Nonrecurrent (feedforward) networks have no feedback elements; the output is calculated directly from the input through feedforward…
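A tiny sketch of that distinction (arbitrary weights, not taken from the chapter): a feedforward layer maps the current input directly to an output, whereas a recurrent layer also feeds its previous output back into the computation.

```python
# Feedforward vs. recurrent: the recurrent update also depends on the
# layer's previous output. Weights here are arbitrary placeholders.
import numpy as np

W_in = np.array([[0.5, -0.3], [0.2, 0.8]])
W_fb = np.array([[0.1, 0.0], [0.0, 0.1]])          # feedback weights (recurrent only)

def feedforward(x):
    return np.tanh(W_in @ x)                       # output depends on x alone

def recurrent_step(x, prev):
    return np.tanh(W_in @ x + W_fb @ prev)         # output also depends on the past output

x = np.array([1.0, -1.0])
print(feedforward(x))
print(recurrent_step(x, prev=np.zeros(2)))
```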
Mapping some functions and four arithmetic operations to multilayer feedforward neural networks
  • J. Pei, E. Mai, Joseph P. Wright
  • Computer Science, Engineering
  • SPIE Smart Structures and Materials + Nondestructive Evaluation and Health Monitoring
  • 2008
TLDR: Details and results are provided for mapping the four arithmetic operations, as well as other functions including the reciprocal, Gaussian, and Mexican hat functions, into multilayer feedforward neural networks with one hidden layer.
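As a hedged illustration of one-hidden-layer function approximation (not the paper's construction), the sketch below fixes random tanh hidden units and fits only the linear output weights by least squares to approximate the Gaussian and Mexican hat functions; the hidden-layer size and weight scales are arbitrary.

```python
# Hedged sketch: approximate the Gaussian and Mexican hat functions with a
# single tanh hidden layer whose input weights are fixed at random; only
# the linear output weights are fitted.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-4, 4, 400).reshape(-1, 1)
targets = {
    "gaussian":    np.exp(-x**2),
    "mexican_hat": (1 - x**2) * np.exp(-x**2 / 2),
}

H = 40                                            # hidden units (arbitrary)
W1 = rng.normal(0, 1.5, (1, H)); b1 = rng.normal(0, 1.5, H)
A = np.tanh(x @ W1 + b1)                          # one hidden layer
Ab = np.hstack([A, np.ones((len(x), 1))])         # plus an output bias term

for name, t in targets.items():
    w, *_ = np.linalg.lstsq(Ab, t, rcond=None)    # fit the output weights
    err = np.max(np.abs(Ab @ w - t))
    print(f"{name}: max abs error = {err:.4f}")
```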