Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training

@article{Mahdi2011ReducedHN,
  title={Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training},
  author={Rami N. Mahdi and Eric C. Rouchka},
  journal={IEEE Transactions on Neural Networks},
  year={2011},
  volume={22},
  pages={673-686}
}
Hyper basis function (HyperBF) networks are generalized radial basis function (RBF) neural networks in which the activation function is a radial function of a weighted distance. This generalization gives HyperBF networks a high capacity to learn complex functions, which in turn makes them susceptible to overfitting and poor generalization. Moreover, training a HyperBF network requires the weights, centers, and local scaling factors to be optimized simultaneously. In the case of a relatively large…
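
For context, a HyperBF network is conventionally written as follows (this is the standard formulation due to Poggio and Girosi; the notation is supplied here for illustration and is not quoted from the paper):

\[
f(x) = \sum_{j=1}^{N} w_j \, \phi\!\big( (x - c_j)^\top W_j^\top W_j \, (x - c_j) \big)
\]

where the \(w_j\) are output weights, the \(c_j\) are centers, and the \(W_j\) are per-unit weighting matrices that define the weighted distance (the "local scaling factors" the abstract refers to); all three parameter sets must be optimized jointly during training. When each \(W_j\) is restricted to a scaled identity \(\tfrac{1}{\sigma_j} I\), the model reduces to a standard RBF network, which is why HyperBF networks are strictly more expressive and correspondingly more prone to overfitting.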