Strategies for constructive neural networks and its application to regression models

  • Jifu Nong
  • Published 2012 in
    2012 International Conference on Computer Science and Information Processing (CSIP)

Abstract

The regression problem is an important application area for neural networks (NNs). Among the many existing NN architectures, the feedforward NN (FNN) is one of the most widely used. Although one-hidden-layer feedforward neural networks (OHL-FNNs) have simple structures, they possess strong representational and learning capabilities. This paper focuses on incremental constructive training of OHL-FNNs. In the proposed incremental constructive training schemes for an OHL-FNN, input-side training and output-side training may be separated in order to reduce the training time. A new technique is proposed that scales the error signal during the constructive learning process, improving input-side training efficiency and yielding better generalization performance. Two pruning methods for removing redundant input-side connections are also applied. Numerical simulations demonstrate the potential and advantages of the proposed strategies compared with other existing techniques in the literature.
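The incremental scheme described in the abstract, growing an OHL-FNN one hidden unit at a time with input-side and output-side training handled separately, can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's exact algorithm: it uses a random candidate pool with residual-correlation selection for the input-side step and least squares for the output-side step (both common choices in constructive-FNN work); the function names, pool size, and stopping tolerance are hypothetical.

```python
import numpy as np

def train_constructive_ohl_fnn(X, y, max_hidden=10, tol=1e-3, seed=0):
    """Grow a one-hidden-layer FNN for regression, one unit at a time.

    Sketch under assumptions: candidate-pool input-side training and
    least-squares output-side training stand in for the paper's methods.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.c_[X, np.ones(n)]       # inputs with bias term
    H = np.ones((n, 1))             # hidden activations (starts as bias column)
    W_in = []                       # input-side weights, one vector per unit

    for _ in range(max_hidden):
        # Output-side training: solve the linear output weights exactly.
        w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
        residual = y - H @ w_out
        if np.mean(residual ** 2) < tol:
            break
        # Input-side training: from a random pool of candidate units,
        # keep the one whose activation best correlates with the residual.
        cands = rng.standard_normal((d + 1, 25))
        acts = np.tanh(Xb @ cands)
        corr = np.abs((acts - acts.mean(axis=0)).T @ (residual - residual.mean()))
        best = cands[:, np.argmax(corr)]
        W_in.append(best)
        H = np.c_[H, np.tanh(Xb @ best)]

    w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W_in, w_out

def predict(W_in, w_out, X):
    """Evaluate the grown network on new inputs."""
    n = X.shape[0]
    Xb = np.c_[X, np.ones(n)]
    H = np.ones((n, 1))
    for w in W_in:
        H = np.c_[H, np.tanh(Xb @ w)]
    return H @ w_out
```

Because the output weights are re-solved by least squares after each added unit, the output-side step is cheap and only the single new unit's input-side weights are trained, which is the source of the training-time savings the abstract attributes to separating the two stages.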

2 Figures and Tables

Cite this paper

@article{Nong2012StrategiesFC,
  title   = {Strategies for constructive neural networks and its application to regression models},
  author  = {Jifu Nong},
  journal = {2012 International Conference on Computer Science and Information Processing (CSIP)},
  year    = {2012},
  pages   = {197-201}
}