Implications of Inter-Rater Agreement on a Student Information Retrieval Evaluation

This paper presents an information retrieval evaluation of three retrieval-supporting services. All three services were designed to compensate for typical problems that arise in metadata-driven Digital Libraries and that are not adequately handled by simple tf-idf based retrieval. The services are: (1) a query expansion mechanism based on co-word analysis, and re-ranking via (2) Bradfordizing and (3) author centrality. The services were evaluated with relevance assessments conducted by 73 information science students. Since the students are neither information professionals nor domain experts, the question of inter-rater agreement is taken into consideration. Two important implications emerge: (1) the inter-rater agreement rates were mainly fair to moderate, and (2) after a data-cleaning step that removed the assessments with poor agreement rates, the evaluation data show that the three retrieval services returned disjoint but still relevant result sets.
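The abstract characterizes agreement as "fair to moderate," phrasing that conventionally maps to kappa-style agreement statistics. The exact statistic used is not named in this abstract; as a hedged illustration, Fleiss' kappa is one standard choice when many raters assess many items, and can be sketched as:

```python
# Illustrative Fleiss' kappa computation. Assumption: the paper's exact
# agreement statistic is not stated in this abstract; Fleiss' kappa is a
# common choice for multi-rater relevance assessments.

def fleiss_kappa(counts):
    """counts: one row per assessed document; each row holds the number of
    raters who chose each relevance category for that document. Every row
    must sum to the same number of raters."""
    n_items = len(counts)
    n_raters = sum(counts[0])
    n_categories = len(counts[0])

    # Observed agreement: per item, the fraction of rater pairs that agree.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items

    # Chance agreement: based on overall category proportions.
    p_e = sum(
        (sum(row[j] for row in counts) / (n_items * n_raters)) ** 2
        for j in range(n_categories)
    )
    return (p_bar - p_e) / (1 - p_e)

# Example: 3 raters, two categories (relevant / not relevant), 4 documents.
# Perfect agreement yields kappa = 1.0; on the widely used Landis & Koch
# scale, 0.21-0.40 reads as "fair" and 0.41-0.60 as "moderate".
print(fleiss_kappa([[3, 0], [0, 3], [3, 0], [0, 3]]))  # -> 1.0
```

Documents whose kappa falls below such a threshold could then be dropped, mirroring the data-cleaning step the abstract describes.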
Philipp Schaer, Philipp Mayr, Peter Mutschke
Added 09 Dec 2010
Updated 09 Dec 2010
Type Journal
Year 2010
Where CoRR