Integer programming models for feature selection: New extensions and a randomized solution algorithm


Feature selection methods are used in machine learning and data analysis to select a subset of features that may be successfully used in the construction of a model for the data. These methods are applied under the assumption that many of the available features are often redundant for the purpose of the analysis. In this paper, we focus on a particular method for feature selection in supervised learning problems, based on a linear programming model with integer variables. For the solution of the optimization problem associated with this approach, we propose a novel robust metaheuristic algorithm that relies on a Greedy Randomized Adaptive Search Procedure, extended with the adoption of short memory and a local search strategy. The performance of our heuristic algorithm compares favorably with that of well-established feature selection methods, on both simulated and real data from biological applications. The obtained results suggest that our method is particularly suited for problems with a very large number of binary or categorical features. © 2015 Elsevier B.V. and Association of European Operational Research Societies (EURO) within the International Federation of Operational Research Societies (IFORS). All rights reserved.
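The Greedy Randomized Adaptive Search Procedure (GRASP) mentioned in the abstract can be illustrated with a minimal sketch: each iteration builds a feature subset by repeatedly sampling from a restricted candidate list (RCL) of the best-scoring extensions, then applies a swap-based local search. The `score` function, the subset size `k`, and the RCL parameter `alpha` below are all illustrative placeholders, not the paper's actual integer-programming objective or the authors' short-memory extension.

```python
import random

def grasp_feature_selection(features, score, n_iters=50, alpha=0.3, k=10, seed=0):
    """Generic GRASP sketch: greedy randomized construction + local search.

    `score(subset)` is a hypothetical evaluation function (higher is better);
    the paper's actual objective comes from its integer-programming model.
    """
    rng = random.Random(seed)
    best, best_val = None, float("-inf")
    for _ in range(n_iters):
        # Construction phase: grow a subset of k features, each chosen at
        # random from a restricted candidate list (RCL) of top extensions.
        subset = []
        while len(subset) < k:
            cand = [f for f in features if f not in subset]
            ranked = sorted(cand, key=lambda f: score(subset + [f]), reverse=True)
            rcl = ranked[: max(1, int(alpha * len(ranked)))]
            subset.append(rng.choice(rcl))
        # Local search phase: first-improvement swaps of one selected
        # feature for one unselected feature, until no swap helps.
        improved = True
        while improved:
            improved = False
            for i in range(len(subset)):
                for f_out in features:
                    if f_out in subset:
                        continue
                    trial = subset[:i] + [f_out] + subset[i + 1:]
                    if score(trial) > score(subset):
                        subset, improved = trial, True
                        break
                if improved:
                    break
        val = score(subset)
        if val > best_val:
            best, best_val = subset, val
    return best, best_val
```

For instance, with integer feature labels and `score = sum`, the procedure selects the largest labels; in practice the score would be an accuracy- or coverage-based criterion evaluated on training data.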

DOI: 10.1016/j.ejor.2015.09.051

Cite this paper

@article{Bertolazzi2016IntegerPM,
  title={Integer programming models for feature selection: New extensions and a randomized solution algorithm},
  author={Paola Bertolazzi and Giovanni Felici and Paola Festa and Giulia Fiscon and Emanuel Weitschek},
  journal={European Journal of Operational Research},
  year={2016},
  volume={250},
  pages={389-399}
}