The following publications are possibly variants of this publication:
- ADAM: Dense Retrieval Distillation with Adaptive Dark Examples. Chongyang Tao, Chang Liu, Tao Shen 0001, Can Xu, Xiubo Geng, Binxing Jiao, Daxin Jiang. acl 2024: 11639-11651 [doi]
- LEAD: Liberal Feature-based Distillation for Dense Retrieval. Hao Sun, Xiao Liu, Yeyun Gong, Anlei Dong, Jingwen Lu, Yan Zhang, Linjun Yang, Rangan Majumder, Nan Duan. wsdm 2024: 655-664 [doi]
- In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval. Sheng-Chieh Lin, Jheng-Hong Yang, Jimmy Lin. rep4nlp 2021: 163-173 [doi]
- Boot and Switch: Alternating Distillation for Zero-Shot Dense Retrieval. Fan Jiang, Qiongkai Xu, Tom Drummond, Trevor Cohn. emnlp 2023: 912-931 [doi]
- A Joint Link and Content Approach to Information Retrieval and Distillation. Y. Liu, M. Liang. wecwis 2005: 207-214 [doi]
- ReGen: Zero-Shot Text Classification via Training Data Generation with Progressive Dense Retrieval. Yue Yu, Yuchen Zhuang, Rongzhi Zhang, Yu Meng 0001, Jiaming Shen, Chao Zhang. acl 2023: 11782-11805 [doi]
- MTA4DPR: Multi-Teaching-Assistants Based Iterative Knowledge Distillation for Dense Passage Retrieval. Qixi Lu, Endong Xun, Gongbo Tang. emnlp 2024: 5871-5883 [doi]