## A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation

- Najdan Vukovic, Zoran Miljkovic
- Neural Networks
- 2013


```
@article{Mahdi2011ReducedHN,
  title   = {Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training},
  author  = {Rami N. Mahdi and Eric C. Rouchka},
  journal = {IEEE Transactions on Neural Networks},
  year    = {2011},
  volume  = {22},
  pages   = {673-686}
}
```

- Published 2011 in IEEE Transactions on Neural Networks
- DOI: 10.1109/TNN.2011.2109736

Hyper basis function (HyperBF) networks are generalized radial basis function neural networks in which the activation function is a radial function of a weighted distance. This generalization gives HyperBF networks a high capacity to learn complex functions, which in turn makes them susceptible to overfitting and poor generalization. Moreover, training a HyperBF network requires the weights, centers, and local scaling factors to be optimized simultaneously. In the case of a relatively large…
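To make the "radial function of a weighted distance" idea concrete, here is a minimal sketch of a HyperBF unit and network output. This is an illustrative reconstruction, not the paper's implementation: the function names and the choice of a Gaussian radial function are assumptions, and the per-unit matrix `W` plays the role of the learned local scaling factors.

```python
import numpy as np

def hyperbf_activation(x, center, W):
    """One HyperBF unit: Gaussian of a weighted distance.

    In a standard RBF, W is the identity (or a single scalar scale);
    HyperBF generalizes this by letting each unit learn its own
    metric via a full matrix W (an assumption for illustration).
    """
    d = x - center
    # weighted squared distance: d^T (W^T W) d, so the metric is
    # positive semi-definite for any W
    dist_sq = d @ (W.T @ W) @ d
    return np.exp(-dist_sq)

def hyperbf_network(x, centers, Ws, weights):
    """Network output: weighted sum of unit activations.

    Training would optimize weights, centers, and the Ws jointly,
    which is the simultaneous optimization the abstract refers to.
    """
    acts = np.array([hyperbf_activation(x, c, W)
                     for c, W in zip(centers, Ws)])
    return weights @ acts
```

With `W` set to the identity, each unit reduces to an ordinary Gaussian RBF, which shows how HyperBF strictly generalizes the standard case.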
