

Gradient descent optimization of smoothed information retrieval metrics

Abstract
Most ranking algorithms are based on the optimization of some loss function, such as the pairwise loss. However, these loss functions are often different from the criteria used to measure the quality of web page ranking results. To overcome this problem, we propose an algorithm that aims at directly optimizing popular measures such as the Normalized Discounted Cumulative Gain (NDCG) and the Average Precision (AP). The basic idea is to minimize a smooth approximation of these measures with gradient descent. Crucial to this kind of approach is the choice of the smoothing factor. We provide a theoretical analysis of that choice and propose an annealing algorithm that iteratively minimizes a less and less smoothed approximation of the measure of interest. Results on the LETOR benchmark datasets show that the proposed algorithm achieves state-of-the-art performance.
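To make the idea concrete, the sketch below shows how an annealed optimization of a smoothed NDCG could look. It is only an illustration under stated assumptions: the sigmoid-based soft-rank surrogate, the linear scoring model, the finite-difference gradient, and the names smooth_ndcg, numeric_grad and optimize are all hypothetical stand-ins, not the exact construction used in the paper.

# Illustrative sketch (assumed smoothing, not the paper's exact surrogate):
# optimize a differentiable NDCG approximation for a decreasing sequence of
# smoothing factors sigma, warm-starting each stage from the previous solution.
import numpy as np

def smooth_ndcg(scores, rels, sigma):
    # Soft rank of document i: 1 + sum_{j != i} sigmoid((s_j - s_i) / sigma);
    # the self term sigmoid(0) = 0.5 is folded into the constant below.
    diffs = (scores[None, :] - scores[:, None]) / sigma
    soft_ranks = 0.5 + (1.0 / (1.0 + np.exp(-diffs))).sum(axis=1)
    gains = 2.0 ** rels - 1.0
    dcg = np.sum(gains / np.log2(1.0 + soft_ranks))
    ideal = np.sort(gains)[::-1]
    idcg = max(np.sum(ideal / np.log2(2.0 + np.arange(len(rels)))), 1e-12)
    return dcg / idcg

def numeric_grad(f, x, eps=1e-5):
    # Central finite differences; enough for a toy example.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def optimize(features, rels, sigmas=(1.0, 0.3, 0.1), lr=0.1, steps=200):
    # Annealing loop: maximize the surrogate for smaller and smaller sigma,
    # i.e. a less and less smoothed approximation, with a linear scorer w^T x.
    w = np.zeros(features.shape[1])
    for sigma in sigmas:
        for _ in range(steps):
            obj = lambda w_: smooth_ndcg(features @ w_, rels, sigma)
            w += lr * numeric_grad(obj, w)  # gradient ascent on the surrogate
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 4))                      # 8 documents, 4 features (one toy query)
    rel = rng.integers(0, 3, size=8).astype(float)   # graded relevance labels
    print("learned weights:", optimize(X, rel))

In a real ranker the finite-difference gradient would be replaced by the analytic gradient of the surrogate, and the objective would be summed over all training queries; the annealing schedule on sigma is what connects the smooth surrogate back to the original, non-smooth NDCG.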
Type: Journal
Year: 2010
Where: IR
Authors: Olivier Chapelle, Mingrui Wu