MM 2009, ACM

Comparing fact finding tasks and user survey for evaluating a video browsing tool

There are still no established methods for evaluating browsing and exploratory search tools. In the (multimedia) information retrieval community, evaluations following the Cranfield paradigm (as used, e.g., in TRECVID) have been widely adopted. We applied two TRECVID-style fact finding approaches (retrieval and question answering tasks) and a user survey to the evaluation of a video browsing tool. We analyze the correlation between the results of the different methods, whether different aspects can be evaluated independently with the survey, and whether a learning effect can be measured with the different methods. The results show that the retrieval task correlates better with the user experience reported in the survey than the question answering tasks do. It turns out that the survey measures the general user experience rather than allowing different aspects of usability to be analyzed independently. Categories and Subject Descriptors H.5.1 [Information Interfaces and Present...
Werner Bailer, Herwig Rehatschek
Added: 28 May 2010
Updated: 28 May 2010
Type: Conference
Year: 2009
Where: MM