- Overview of the 2019 Open-Source IR Replicability Challenge (OSIRRC 2019). Ryan Clancy, Nicola Ferro, Claudia Hauff, Jimmy Lin, Tetsuya Sakai, Ze Zhong Wu. 1-7
- STELLA: Towards a Framework for the Reproducibility of Online Search Experiments. Timo Breuer, Philipp Schaer, Narges Tavakolpoursaleh, Johann Schaible, Benjamin Wolff, Bernd Müller. 8-11
- Let's measure run time! Extending the IR replicability infrastructure to include performance aspects. Sebastian Hofstätter, Allan Hanbury. 12-16
- Reproducible IR needs an (IR) (Graph) Query Language. Chris Kamphuis, Arjen P. de Vries. 17-20
- Entity Retrieval Docker Image for OSIRRC at SIGIR 2019. Negar Arabzadeh. 21-25
- Dockerising Terrier for The Open-Source IR Replicability Challenge (OSIRRC 2019). Arthur Barbosa Câmara, Craig Macdonald. 26-30
- Dockerizing Automatic Routing Runs for The Open-Source IR Replicability Challenge (OSIRRC 2019). Timo Breuer, Philipp Schaer. 31-35
- University of Waterloo Docker Images for OSIRRC at SIGIR 2019. Ryan Clancy, Zeynep Akkalyoncu Yilmaz, Ze Zhong Wu, Jimmy Lin. 36
- A Docker-Based Replicability Study of a Neural Information Retrieval Model. Nicola Ferro, Stefano Marchesin, Alberto Purpura, Gianmaria Silvello. 37-43
- Dockerizing Indri for OSIRRC 2019. Claudia Hauff. 44-46
- The OldDog Docker Image for OSIRRC at SIGIR 2019. Chris Kamphuis, Arjen P. de Vries. 47-49
- PISA: Performant Indexes and Search for Academia. Antonio Mallia, Michal Siedlaczek, Joel Mackenzie, Torsten Suel. 50-56
- ielab at the Open-Source IR Replicability Challenge 2019. Harrisen Scells, Guido Zuccon. 57-61
- BM25 Pseudo Relevance Feedback Using Anserini at Waseda University. Zhaohao Zeng, Tetsuya Sakai. 62-63