Laporte, Léa; Flamary, Rémi; Canu, Stéphane; Déjean, Sébastien; Mothe, Josiane (2014). Nonconvex Regularizations for Feature Selection in Ranking with Sparse SVM. IEEE Transactions on Neural Networks and Learning Systems, 25(6), 1118-1130. ISSN 2162-237X.
(Document in English)
PDF (Author's version), 1 MB
Official URL: http://dx.doi.org/10.1109/TNNLS.2013.2286696
Abstract
Feature selection in learning to rank has recently emerged as a crucial issue. Although several preprocessing approaches have been proposed, only a few works integrate feature selection directly into the learning process. In this paper, we propose a general framework for feature selection in learning to rank using support vector machines with a sparse regularization term. We investigate both classical convex regularizations, such as the ℓ1 norm or a weighted ℓ1 norm, and nonconvex regularization terms, such as the log penalty, the minimax concave penalty (MCP), or the ℓp pseudo-norm with p < 1. Two algorithms are proposed: the first is an accelerated proximal approach for solving the convex problems; the second is a reweighted ℓ1 scheme that addresses the nonconvex regularizations. We conduct extensive experiments on nine datasets from the Letor 3.0 and Letor 4.0 corpora. Numerical results show that the proposed nonconvex regularizations yield sparser models while preserving prediction performance: the number of selected features is reduced by up to a factor of six compared with the ℓ1 regularization. In addition, the software is publicly available on the web.
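For illustration, the sketch below shows one plausible reading of the two algorithms the abstract names: a FISTA-style accelerated proximal method for the (weighted) ℓ1-regularized pairwise ranking SVM, and an outer reweighted ℓ1 loop handling a nonconvex penalty (here the log penalty, one of those listed above). This is a minimal sketch under stated assumptions, not the authors' released software: the squared hinge loss, the Lipschitz estimate, and all function names and parameter values (lam, eps, n_outer, ...) are choices made for the example.

```python
import numpy as np

# Illustrative sketch only: the paper's exact loss, solver details, and
# hyperparameters may differ. Squared hinge is used here to keep the
# data-fitting term smooth, which proximal gradient methods require.

def pairwise_diffs(X, y):
    # RankSVM reduction: build difference vectors x_i - x_j for every
    # pair with y_i > y_j, turning ranking into binary classification.
    idx = [(i, j) for i in range(len(y)) for j in range(len(y)) if y[i] > y[j]]
    return np.array([X[i] - X[j] for i, j in idx])

def prox_weighted_l1(w, thresh):
    # Soft-thresholding: proximal operator of the weighted l1 norm.
    return np.sign(w) * np.maximum(np.abs(w) - thresh, 0.0)

def fista_ranksvm_l1(D, lam, weights, n_iter=200):
    # Accelerated proximal gradient (FISTA) on the smooth pairwise loss
    # (1/n) * sum_k max(0, 1 - <w, d_k>)^2 plus a weighted-l1 penalty.
    n_pairs, n_feat = D.shape
    L = 2.0 * np.linalg.norm(D, 2) ** 2 / n_pairs  # Lipschitz bound on the gradient
    w = np.zeros(n_feat); z = w.copy(); t = 1.0
    for _ in range(n_iter):
        margin = 1.0 - D @ z
        grad = -2.0 * D.T @ np.maximum(margin, 0.0) / n_pairs
        w_new = prox_weighted_l1(z - grad / L, lam * weights / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w_new + ((t - 1.0) / t_new) * (w_new - w)
        w, t = w_new, t_new
    return w

def reweighted_l1_ranksvm(X, y, lam=0.1, eps=1e-3, n_outer=5):
    # Reweighted-l1 (majorize-minimize) scheme for the nonconvex log
    # penalty sum_j log(eps + |w_j|): each outer step solves a weighted
    # l1 problem with weights 1 / (eps + |w_j|) from the previous iterate.
    D = pairwise_diffs(X, y)
    weights = np.ones(X.shape[1])
    for _ in range(n_outer):
        w = fista_ranksvm_l1(D, lam, weights)
        weights = 1.0 / (eps + np.abs(w))
    return w

if __name__ == "__main__":
    # Toy data: relevance scores driven by 3 of 20 features.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((60, 20))
    w_true = np.zeros(20); w_true[:3] = [2.0, -1.5, 1.0]
    y = X @ w_true + 0.1 * rng.standard_normal(60)
    w = reweighted_l1_ranksvm(X, y, lam=0.05)
    print("selected features:", np.flatnonzero(np.abs(w) > 1e-6))
```

The outer loop follows the standard majorize-minimize argument for the log penalty: linearizing log(eps + |w_j|) around the current iterate yields per-feature weights 1 / (eps + |w_j|), so each outer step reduces to a weighted ℓ1 problem that the inner accelerated proximal routine solves, driving small coefficients toward exact zeros faster than plain ℓ1.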