Anne Schuth. OpenSearch. In Proceedings of DIR'15, 2015.


The last part of this thesis is of a different nature than the earlier two parts. As opposed to the earlier chapters, we no longer study algorithms. Progress in information retrieval research has always been driven by a combination of algorithms, shared resources, and evaluation. In the last part we focus on the latter two: we introduce a new shared resource and a new evaluation paradigm. Firstly, we propose Lerot, an online evaluation framework that allows us to simulate users interacting with a search engine. Our implementation has been released as open source software and is currently being used by researchers around the world. Secondly, we introduce OpenSearch, a new evaluation paradigm involving real users of real search engines. We describe an implementation of this paradigm that has already been widely adopted by the research community through challenges at CLEF and TREC.

It is time for a paradigm shift. Cranfield-style evaluation has served us well for many years, but relevance assessments from judges are very different from what actually satisfies users. We should move to online evaluation, where we use implicit user signals to validate retrieval systems. A major issue for academics, however, has been the lack of access to a system with users. OpenSearch changes this. OpenSearch opens up real search engines with real users for research, allowing researchers to expose their retrieval system to real, unsuspecting users with real information needs that can actually be satisfied.
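To illustrate the kind of user simulation Lerot performs, the sketch below implements a simple cascade-style click model: a simulated user scans a ranked list top-down, clicks depending on document relevance, and may stop after a click. This is a minimal illustration, not Lerot's actual code; the function name, probability parameters, and binary relevance grades are assumptions chosen for clarity.

```python
import random

def simulate_clicks(ranking, relevance, p_click=(0.05, 0.95),
                    p_stop=(0.0, 0.9), seed=None):
    """Simulate a user scanning a ranked list top-down (cascade-style model).

    ranking:   list of document ids, best-ranked first
    relevance: dict mapping doc id -> 0 (non-relevant) or 1 (relevant);
               missing ids are treated as non-relevant
    p_click:   click probability given the relevance grade (hypothetical values)
    p_stop:    probability of abandoning the scan after a click, per grade
    """
    rng = random.Random(seed)
    clicks = []
    for doc in ranking:
        rel = relevance.get(doc, 0)
        # The user examines each document in order and clicks probabilistically.
        if rng.random() < p_click[rel]:
            clicks.append(doc)
            # After a satisfying click the user may stop scanning entirely.
            if rng.random() < p_stop[rel]:
                break
    return clicks
```

Feeding such simulated clicks back to a ranker, over many simulated queries, is what lets an online evaluation framework compare retrieval systems without access to real users.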



