Chen-Chia Chuang

Support vector regression (SVR) applies the support vector machine (SVM) to problems of function approximation and regression estimation. SVR has been shown to be robust against noise; nevertheless, when its parameters are improperly selected, overfitting may still occur. However, the selection of these parameters is not …
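A minimal sketch of epsilon-SVR for noisy function approximation using scikit-learn, with the parameters chosen by cross-validation rather than fixed by hand. The sinc target, noise level, and parameter grid are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)   # noisy target

# Improper C / epsilon / gamma can over- or under-fit, so the parameters
# are selected here by cross-validation instead of being set manually.
grid = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10], "epsilon": [0.01, 0.1], "gamma": [0.1, 1.0]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, grid.score(X, y))
```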
In this paper, annealing robust radial basis function networks (ARRBFNs) are proposed to overcome the problems of robust radial basis function networks (RBFNs) for function approximation with outliers. Firstly, a support vector regression (SVR) approach is used to determine an initial structure for the ARRBFNs. Because an SVR approach is …
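A sketch of the initialization idea described above: the support vectors returned by SVR serve as the initial RBF centers, and initial output weights are then fitted. The common Gaussian width and the plain least-squares weight fit are illustrative assumptions; the annealing robust learning stage itself is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = np.linspace(-3, 3, 150).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(150)
y[::25] += 3.0                                  # a few artificial outliers

svr = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=1.0).fit(X, y)
centers = svr.support_vectors_                  # support vectors -> RBF centers
width = 1.0                                     # assumed common Gaussian width

def design_matrix(X, centers, width):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

Phi = design_matrix(X, centers, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)     # initial output weights
print(centers.shape[0], "hidden nodes, train RMSE:",
      np.sqrt(np.mean((Phi @ w - y) ** 2)))
```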
Multilayer feedforward neural networks are often referred to as universal approximators. Nevertheless, if the training data are corrupted by large noise, such as outliers, traditional backpropagation learning schemes may not always yield acceptable performance. Even though various robust learning algorithms have been proposed in the literature, …
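A minimal numpy sketch of training a one-hidden-layer feedforward network with a Huber-style robust loss, which bounds the influence of outliers on the backpropagated gradients. The network size, learning rate, and cutoff delta are illustrative assumptions, not the specific robust algorithms surveyed above.

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(200)
y[::40] += 4.0                                   # gross outliers

H, lr, delta = 20, 0.05, 0.2
W1 = rng.standard_normal((1, H)) * 0.5
b1 = np.zeros(H)
w2 = rng.standard_normal(H) * 0.5
b2 = 0.0

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                     # hidden activations
    pred = h @ w2 + b2
    r = pred - y
    # Huber influence: linear for |r| <= delta, clipped outside,
    # so large outlier residuals do not dominate the weight updates.
    g = np.clip(r, -delta, delta) / len(y)
    grad_w2 = h.T @ g
    grad_b2 = g.sum()
    gh = np.outer(g, w2) * (1.0 - h ** 2)        # backprop through tanh
    grad_W1 = X.T @ gh
    grad_b1 = gh.sum(axis=0)
    w2 -= lr * grad_w2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

h = np.tanh(X @ W1 + b1)
print("robust-fit RMSE vs. clean target:",
      np.sqrt(np.mean((h @ w2 + b2 - np.sin(X).ravel()) ** 2)))
```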
Takagi–Sugeno–Kang (TSK) fuzzy models have attracted great attention from the fuzzy modeling community due to their good performance in various applications. Various approaches for modeling TSK fuzzy rules have been proposed in the literature. Most of them define their fuzzy subspaces based on the idea of training data being close enough, instead …
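A minimal numpy sketch of a first-order TSK fuzzy model: Gaussian membership functions define the rule firing strengths, and the output is the firing-strength-weighted average of linear rule consequents. The two hand-set rules and their parameters are purely illustrative.

```python
import numpy as np

# Rule antecedents: Gaussian membership functions on a single input x.
centers = np.array([-1.0, 1.0])    # membership centers
sigmas = np.array([1.0, 1.0])      # membership widths
# Rule consequents: y_i = a_i * x + b_i  (first-order TSK).
a = np.array([0.5, -0.5])
b = np.array([1.0, 2.0])

def tsk_predict(x):
    mu = np.exp(-((x - centers) ** 2) / (2.0 * sigmas ** 2))  # firing strengths
    y_rules = a * x + b                                        # rule outputs
    return np.sum(mu * y_rules) / np.sum(mu)                   # weighted average

for x in (-2.0, 0.0, 2.0):
    print(x, tsk_predict(x))
```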
This paper introduces a new structure of radial basis function networks (RBFNs) that can successfully model symbolic interval-valued data. In the proposed structure, to handle symbolic interval data, the Gaussian functions required in the RBFNs are modified to use an interval distance measure, and the synaptic weights of the RBFNs are replaced by linear …
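A sketch of an RBFN hidden unit for symbolic interval-valued inputs: each feature is an interval [lower, upper], and the Gaussian is evaluated on an interval distance. The squared lower/upper-bound distance used here is one common choice and an assumption, not necessarily the measure adopted in the paper.

```python
import numpy as np

def interval_distance_sq(x, c):
    """Squared distance between interval vectors of shape (n_features, 2)."""
    lo = x[:, 0] - c[:, 0]
    hi = x[:, 1] - c[:, 1]
    return np.sum(lo ** 2 + hi ** 2)

def gaussian_unit(x, center, width):
    # Gaussian activation driven by the interval distance measure.
    return np.exp(-interval_distance_sq(x, center) / (2.0 * width ** 2))

# One two-feature interval sample and one hidden-unit interval center.
x = np.array([[1.0, 2.0], [0.0, 0.5]])
c = np.array([[0.8, 1.9], [0.1, 0.4]])
print(gaussian_unit(x, c, width=1.0))
```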