Data sets ordinarily include a huge number of attributes, many of which are irrelevant or redundant. Redundant and irrelevant attributes may reduce classification accuracy because of the huge search space they create. The main goal of attribute reduction is to choose a subset of relevant attributes from the large number of available attributes so as to obtain comparable or …
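To make the attribute-reduction setting concrete, the sketch below shows one common, wrapper-style way a candidate attribute subset can be scored: a classifier is trained only on the selected columns and its cross-validated accuracy becomes the subset's quality. This is a generic illustration rather than the specific method of the paper; the function name `subset_accuracy` and the choice of a 3-nearest-neighbour classifier are assumptions made here for the example.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def subset_accuracy(X, y, attr_mask):
    """Score one candidate attribute subset (a boolean mask over columns)
    by the cross-validated accuracy of a simple classifier trained on the
    selected columns only."""
    selected = np.flatnonzero(attr_mask)
    if selected.size == 0:          # an empty subset carries no information
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, selected], y, cv=5).mean()
```

A search procedure over subsets would call this score for each candidate mask; with n attributes there are 2^n possible masks, which is the "huge search space" the abstract refers to.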
The Multi-Layer Perceptron (MLP) is one type of Feed-Forward Neural Network (FFNN). Searching for the weights and biases of an MLP is essential to achieve a minimum training error. In this paper, the Moth-Flame Optimizer (MFO) is used to train an MLP. The resulting MFO-MLP searches for the weights and biases of the MLP so as to achieve a minimum error and …
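The sketch below illustrates the encoding usually assumed in metaheuristic MLP training: a flat search-agent vector is unpacked into the weights and biases of a single-hidden-layer MLP and scored by mean squared error, which is the kind of objective a trainer such as MFO would minimize. The helper names (`unpack`, `mse_fitness`), the sigmoid activations, and the one-hidden-layer architecture are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def unpack(vector, n_in, n_hidden, n_out):
    """Split a flat search-agent vector into the weights and biases of a
    one-hidden-layer MLP."""
    i = 0
    W1 = vector[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = vector[i:i + n_hidden]; i += n_hidden
    W2 = vector[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = vector[i:i + n_out]
    return W1, b1, W2, b2

def mse_fitness(vector, X, y_onehot, n_hidden):
    """Training error (MSE) of the MLP encoded by `vector`; a metaheuristic
    trainer would minimize this value over candidate vectors."""
    n_in, n_out = X.shape[1], y_onehot.shape[1]
    W1, b1, W2, b2 = unpack(vector, n_in, n_hidden, n_out)
    hidden = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))    # sigmoid hidden layer
    output = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))
    return np.mean((output - y_onehot) ** 2)
```

Each search agent in the population is then simply one such vector, and the optimizer's update rules move the agents toward vectors with lower MSE.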
In this paper, the Ant Lion Optimizer (ALO) was presented as a trainer for the Multi-Layer Perceptron (MLP). ALO was used to find the weights and biases of the MLP so as to achieve a minimum error and a high classification rate. Four standard classification datasets were used to benchmark the performance of the proposed method. In addition, the performance of the proposed method …
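For the classification-rate objective mentioned above, a minimal sketch (reusing the hypothetical `unpack` helper from the previous block) could score a candidate weight vector by the fraction of samples whose highest-activation output unit matches the true label. The ALO search-update rules themselves are not shown here; this only illustrates how a trained vector would be evaluated on a benchmark dataset.

```python
import numpy as np

def classification_rate(vector, X, y_labels, n_hidden):
    """Classification rate of the MLP encoded by `vector`: the fraction of
    samples whose highest-activation output unit equals the true class
    label (labels assumed to be integers 0..K-1).
    Reuses the `unpack` helper from the previous sketch."""
    n_in, n_out = X.shape[1], int(y_labels.max()) + 1
    W1, b1, W2, b2 = unpack(vector, n_in, n_hidden, n_out)
    hidden = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    output = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))
    return float(np.mean(output.argmax(axis=1) == y_labels))
```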
The flower pollination algorithm (FPA) is a recent evolutionary computation technique inspired by the pollination process of flowers. In this paper, a model for multi-objective feature selection based on FPA hybridized with rough set theory is proposed. The proposed model exploits the capabilities of …
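As a rough illustration of how rough set theory can enter the feature-selection objective, the sketch below computes the dependency degree of the decision attribute on a candidate attribute subset and folds it, together with the subset size, into a single weighted fitness. The function names and the weight `alpha` are hypothetical, and the paper's actual multi-objective formulation may differ.

```python
from collections import defaultdict

def dependency_degree(rows, decisions, attr_subset):
    """Rough-set dependency degree gamma_B(D): the fraction of objects whose
    equivalence class under the attributes in attr_subset is consistent,
    i.e. all of its members share one decision value."""
    classes = defaultdict(list)
    for row, d in zip(rows, decisions):
        classes[tuple(row[a] for a in attr_subset)].append(d)
    in_positive_region = sum(len(ds) for ds in classes.values()
                             if len(set(ds)) == 1)
    return in_positive_region / len(decisions)

def feature_subset_fitness(rows, decisions, mask, alpha=0.9):
    """Two objectives folded into one score: maximize the rough-set
    dependency of the decision on the selected attributes, and minimize
    the fraction of attributes kept (alpha is a hypothetical weight)."""
    selected = [i for i, keep in enumerate(mask) if keep]
    if not selected:
        return 0.0
    gamma = dependency_degree(rows, decisions, selected)
    return alpha * gamma + (1 - alpha) * (1 - len(selected) / len(mask))
```

In an FPA-based search, each "flower" would carry one binary mask, and the pollination steps would move the population toward masks with higher fitness.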