The need for an objective and automated way of evaluating the performance of different ranking and reranking methods is becoming increasingly important in the web search and meta search domains. Methods for ranking search results range from traditional information retrieval approaches to more recent techniques based on link analysis and other quality measures derived from the documents themselves. There are also a number of strategies for combining different heuristics and answers from multiple experts. With all of these possibilities, it is becoming increasingly difficult to find the best parameters, the best method, or the best mixture of methods that will maximize quality for a particular query type or domain. This paper addresses the problem of automatically comparing the quality of the ordering of documents presented to the user as a list sorted by believed relevance to a given topic or query. We introduce the average position of user clicks as an implicit, automated, and non-intrusive metric for evaluating ranking methods. We also discuss the situations and assumptions under which this metric can be used objectively, addressing various sources of bias. Experiments performed in our meta search engine suggest that this approach has the potential to sample a wide range of query types and users with greater statistical significance than methods that rely on explicit user judgements.
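As a concrete illustration, the metric can be computed directly from a click log. The minimal Python sketch below assumes a log of (method, position) pairs, where position is the 1-based rank of a clicked result; the function name and log format are hypothetical and not taken from the paper. A lower average suggests that a method places the results users actually click closer to the top.

from collections import defaultdict

def average_click_position(click_log):
    """Average rank position of clicked results, per ranking method.

    click_log: iterable of (method, position) pairs, where position
    is the 1-based rank of a clicked result. This log format is an
    assumption for illustration, not the paper's actual schema.
    """
    totals = defaultdict(lambda: [0, 0])  # method -> [position sum, click count]
    for method, position in click_log:
        totals[method][0] += position
        totals[method][1] += 1
    return {method: pos_sum / count for method, (pos_sum, count) in totals.items()}

# Hypothetical usage: method "A" surfaces clicked results higher than "B".
log = [("A", 1), ("A", 3), ("A", 2), ("B", 5), ("B", 8), ("B", 4)]
print(average_click_position(log))  # {'A': 2.0, 'B': 5.67} (approximately)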
Oztekin, B. Uygar; Karypis, George; Kumar, Vipin.
Average Position of User Clicks as an Automated and Non-Intrusive Way of Evaluating Ranking Methods.
Retrieved from the University of Minnesota Digital Conservancy.