Neural-network feature selector

@article{Setiono1997NeuralnetworkFS,
  title={Neural-network feature selector},
  author={Rudy Setiono and Huan Liu},
  journal={IEEE Transactions on Neural Networks},
  year={1997},
  volume={8},
  number={3},
  pages={654--662}
}
  • R. Setiono, Huan Liu
  • Published 1 May 1997
  • Computer Science
  • IEEE Transactions on Neural Networks
Feature selection is an integral part of most learning algorithms. […] By adding a penalty term to the error function of the network, redundant network connections can be distinguished from relevant ones by their small weights once the training process has been completed. A simple criterion for removing an attribute, based on the accuracy rate of the network, is developed. The network is retrained after an attribute is removed, and the selection process is repeated until no attribute meets…
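The procedure summarized above (penalized training followed by backward elimination with retraining) can be sketched roughly as follows. A simple penalized logistic model stands in for the network, and all names and thresholds (`train_penalized`, `tol`) are illustrative assumptions, not the paper's:

```python
import numpy as np

def train_penalized(X, y, penalty=0.1, lr=0.5, epochs=300):
    """Gradient descent on a penalized logistic model -- a simple stand-in
    for the penalized network training described above."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (p - y) / len(y) + penalty * w)
    return w

def accuracy(w, X, y):
    return np.mean(((X @ w) > 0) == y)

def backward_select(X, y, tol=0.02):
    """Drop the attribute whose removal costs the least accuracy, retrain,
    and repeat until every removal would cost more than `tol`."""
    features = list(range(X.shape[1]))
    while len(features) > 1:
        base = accuracy(train_penalized(X[:, features], y), X[:, features], y)
        drop, best = None, -1.0
        for f in features:
            kept = [g for g in features if g != f]
            acc = accuracy(train_penalized(X[:, kept], y), X[:, kept], y)
            if acc > best:
                drop, best = f, acc
        if base - best > tol:  # every removal hurts too much: stop
            break
        features.remove(drop)
    return features
```

On data where only one attribute carries the label, the wrapper repeatedly discards the irrelevant attributes and keeps the informative one.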
Feature selection: a neural approach
  • G. Castellano, A. Fanelli
  • Computer Science
    IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)
  • 1999
TLDR
Experimental results over a well-known classification problem show the feasibility of the proposed approach to feature selection based on neural network pruning and encourage its application to other classification tasks.
Feature selection method using neural network
  • M. Tico, V. Onnia, J. Saarinen
  • Computer Science
    Proceedings 2001 International Conference on Image Processing (Cat. No.01CH37205)
  • 2001
TLDR
A simple method for feature selection using feedforward neural networks is presented, which reduces the size of the feature space significantly and improves classification accuracy.
Input feature selection for classification problems
TLDR
It is demonstrated that the proposed method can match the performance of the ideal greedy selection algorithm when information is distributed uniformly, and it should prove useful for selecting features in classification problems.
Feature selection based on extreme learning machine
TLDR
This work proposes a feature selection algorithm which uses a feature ranking criterion to measure the significance of a feature by computing the aggregate difference of the outputs of the probabilistic SLFN with and without the feature.
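The "output difference" criterion can be illustrated with a minimal random-hidden-layer (ELM-style) sketch. The `elm_fit`/`rank_features` names and the mean-substitution choice are assumptions made here for illustration, not the paper's exact procedure:

```python
import numpy as np

_rng = np.random.default_rng(0)

def elm_fit(X, y, hidden=20):
    """Single-hidden-layer net with random input weights (ELM-style):
    only the output weights are solved for, by least squares."""
    W = _rng.normal(size=(X.shape[1], hidden))
    beta, *_ = np.linalg.lstsq(np.tanh(X @ W), y, rcond=None)
    return W, beta

def rank_features(X, y):
    """Score feature j by the mean output change when x_j is replaced by
    its mean -- an 'output difference' significance criterion."""
    W, beta = elm_fit(X, y)
    out = np.tanh(X @ W) @ beta
    scores = []
    for j in range(X.shape[1]):
        Xm = X.copy()
        Xm[:, j] = X[:, j].mean()
        scores.append(np.mean(np.abs(out - np.tanh(Xm @ W) @ beta)))
    return np.argsort(scores)[::-1]  # most significant first
```

When the target depends mainly on one input, suppressing that input perturbs the network output far more than suppressing any other, so it ranks first.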
Improved mutual information feature selector for neural networks in supervised learning
  • Nojun Kwak, Chong-Ho Choi
  • Computer Science
    IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)
  • 1999
TLDR
A feature selection algorithm is presented that makes more careful use of the mutual information between input attributes than the mutual information feature selector (MIFS) does.
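The MIFS baseline that this work refines scores a candidate feature f by I(f;C) - β Σ I(f;s) over the already-selected features s, penalizing redundancy. A minimal sketch of that baseline, assuming discrete-valued inputs (function names are illustrative):

```python
import numpy as np
from collections import Counter

def mutual_info(a, b):
    """Empirical mutual information I(a;b) in nats for discrete sequences."""
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum(c / n * np.log((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in pab.items())

def mifs(features, target, k=2, beta=0.5):
    """Greedy MIFS baseline: repeatedly pick the feature maximizing
    I(f;C) - beta * sum of I(f;s) over already-selected features s."""
    selected = []
    remaining = list(range(len(features)))
    while len(selected) < k and remaining:
        best = max(remaining,
                   key=lambda f: mutual_info(features[f], target)
                   - beta * sum(mutual_info(features[f], features[s])
                                for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

With β > 0, a feature that merely duplicates an already-selected one scores lower than a feature carrying new information about the class.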
Diversity-Based Feature Selection from Neural Network with Low Computational Cost
TLDR
The proposed diversity-based feature selection method (DFSM) can therefore significantly reduce the size of the hidden layer prior to the feature selection process without degrading the network's performance.
Feature Selection for Modular Neural Network Classifiers
TLDR
Two feature selection techniques for modular neural network classifiers, Relative Importance Factor (RIF) and Relative FLD Weight Analysis (RFWA), are proposed; experiments show that they can successfully detect the irrelevant features in each module and improve accuracy while reducing computational effort.
Feature selection with neural networks
Selecting salient features for classification based on neural network committees
An Incremental Approach to MSE-Based Feature Selection
TLDR
Four batch removal methods based on classifier error rate have been developed to discard irrelevant features and to significantly reduce the computational complexity of searching among a large number of possible solutions.

References

Showing 1-10 of 36 references
Using the ADAP Learning Algorithm to Forecast the Onset of Diabetes Mellitus
TLDR
The ability of an early neural network model, ADAP, to forecast the onset of diabetes mellitus in a high-risk population of Pima Indians is tested, and the results are compared with those of logistic regression and linear perceptron models using precisely the same training and forecasting sets.
A Penalty-Function Approach for Pruning Feedforward Neural Networks
TLDR
The effectiveness of this penalty function for pruning feedforward neural networks by weight elimination is tested on three well-known problems: the contiguity problem, the parity problems, and the MONK's problems.
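Penalties of this weight-elimination style typically combine a saturating term (so large, useful weights are not over-shrunk) with a small quadratic term that drives redundant weights toward zero. A sketch of that general form, with illustrative constants that are not necessarily the paper's:

```python
import numpy as np

def pruning_penalty(w, eps1=0.1, eps2=1e-4, beta=10.0):
    """Weight-elimination style penalty: the first term saturates for
    large weights, while both terms push small weights toward zero."""
    w = np.asarray(w, dtype=float)
    return eps1 * np.sum(beta * w**2 / (1.0 + beta * w**2)) + eps2 * np.sum(w**2)
```

Because the first term flattens out for large |w|, adding this penalty to the error function shrinks small weights much more aggressively (relative to their size) than the weights the network actually relies on.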
Learning Algorithms for Connectionist Networks: Applied Gradient Methods of Nonlinear Optimization
TLDR
It is shown that in plateau regions of relatively constant gradient, the momentum term acts to increase the step size by a factor of 1/(1-μ), where μ is the momentum constant, and that in valley regions with steep sides, momentum acts to focus the search direction toward the local minimum by averaging oscillations in the gradient.
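The 1/(1-μ) step-size amplification under a constant gradient is easy to check numerically: iterating the momentum update settles at an effective step lr/(1-μ) times the plain gradient step (a minimal sketch; the function name is illustrative):

```python
def momentum_steps(grad, lr=0.1, mu=0.9, iters=200):
    """Iterate the momentum update v <- mu*v - lr*grad for a constant
    gradient; v converges to -lr*grad/(1-mu), i.e. the effective step
    is amplified by the factor 1/(1-mu)."""
    v = 0.0
    for _ in range(iters):
        v = mu * v - lr * grad
    return v
```

With mu=0.9 the steady-state step is ten times the plain gradient step, matching the 1/(1-μ) factor above.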
Programs for Machine Learning
TLDR
In his new book, C4.5: Programs for Machine Learning, Quinlan has put together a definitive, much needed description of his complete system, including the latest developments, which will be a welcome addition to the library of many researchers and students.
Database Mining: A Performance Perspective
TLDR
The authors' perspective of database mining as the confluence of machine learning techniques and the performance emphasis of database technology is presented and an algorithm for classification obtained by combining the basic rule discovery operations is given.
Improving the Convergence of the Backpropagation Algorithm Using Learning Rate Adaptation Methods
This article focuses on gradient-based backpropagation algorithms that use either a common adaptive learning rate for all weights or an individual adaptive learning rate for each weight and apply the…
A Neural Network Construction Algorithm which Maximizes the Likelihood Function
TLDR
A new method for constructing a feedforward neural network is proposed that starts with a single hidden unit and more units are added to the hidden layer one at a time until a network that is suitable for inference is constructed.
Concept acquisition through representational adjustment
TLDR
This thesis promotes the hypothesis that the necessary abstractions can be learned and presents a model that relies on a weighted, symbolic description of concepts; the model should scale up to larger tasks than those studied and has a number of potential applications.