FSMRank: Feature Selection Algorithm for Learning to Rank


In recent years, there has been growing interest in learning to rank. The introduction of feature selection into different learning problems has been proven effective. These facts motivate us to investigate the problem of feature selection for learning to rank. We propose a joint convex optimization formulation which minimizes ranking errors while simultaneously conducting feature selection. This optimization formulation provides a flexible framework in which we can easily incorporate various importance measures and similarity measures of the features. To solve this optimization problem, we use Nesterov's approach to derive an accelerated gradient algorithm with a fast convergence rate O(1/T^2). We further develop a generalization bound for the proposed optimization problem using the Rademacher complexities. Extensive experimental evaluations are conducted on the public LETOR benchmark datasets. The results demonstrate that the proposed method achieves: 1) significant ranking performance gains compared to several feature selection baselines for ranking, and 2) very competitive performance compared to several state-of-the-art learning-to-rank algorithms.
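The accelerated gradient scheme the abstract refers to can be illustrated with a minimal FISTA-style sketch: a smooth convex loss plus a nonsmooth sparsity-inducing regularizer, optimized with Nesterov extrapolation to get the O(1/T^2) rate. This is not the authors' exact FSMRank algorithm; a least-squares loss stands in for the ranking loss, and the function names and constants below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1; zeroing small entries is what
    # drives feature selection in sparse formulations like this one.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def accelerated_prox_gradient(grad, prox, w0, step, T=200):
    # FISTA-style accelerated proximal gradient: for a smooth convex loss
    # plus a nonsmooth convex regularizer, the objective gap decays as
    # O(1/T^2), versus O(1/T) for plain proximal gradient descent.
    w = w0.copy()
    v = w0.copy()          # extrapolation (momentum) point
    t = 1.0
    for _ in range(T):
        w_next = prox(v - step * grad(v), step)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        v = w_next + ((t - 1.0) / t_next) * (w_next - w)
        w, t = w_next, t_next
    return w

# Toy demo: sparse least squares as a stand-in for the ranking loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]      # only 3 of 10 features matter
y = X @ w_true
lam = 0.1
grad = lambda w: X.T @ (X @ w - y) / len(y)
prox = lambda z, s: soft_threshold(z, s * lam)
L = np.linalg.eigvalsh(X.T @ X / len(y)).max()  # Lipschitz constant of grad
w_hat = accelerated_prox_gradient(grad, prox, np.zeros(10), 1.0 / L)
```

The recovered `w_hat` is large on the three informative features and (near-)zero elsewhere, which is the feature-selection effect the joint formulation is designed to produce.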

DOI: 10.1109/TNNLS.2013.2247628


Cite this paper

@article{Lai2013FSMRankFS,
  title   = {FSMRank: Feature Selection Algorithm for Learning to Rank},
  author  = {Hanjiang Lai and Yan Pan and Yong Tang and Rong Yu},
  journal = {IEEE Transactions on Neural Networks and Learning Systems},
  year    = {2013},
  volume  = {24},
  pages   = {940-952}
}