The following publications may be variants of this publication:
- Boosting Prompt-Based Few-Shot Learners Through Out-of-Domain Knowledge Distillation. Xiaoqing Chen, Chengyu Wang 0001, Junwei Dong, Minghui Qiu, Liang Feng, Jun Huang 0007. icassp 2023: 1-5
- Prompt-Distiller: Few-Shot Knowledge Distillation for Prompt-Based Language Learners with Dual Contrastive Learning. Boyu Hou, Chengyu Wang 0001, Xiaoqing Chen, Minghui Qiu, Liang Feng, Jun Huang 0007. icassp 2023: 1-5
- Distilling a Powerful Student Model via Online Knowledge Distillation. Shaojie Li, Mingbao Lin, Yan Wang 0059, Yongjian Wu, Yonghong Tian 0001, Ling Shao 0001, Rongrong Ji. tnn, 34(11):8743-8752, November 2023
- Dimension-Prompts Boost Commonsense Consolidation. Jiazhan Feng, Chongyang Tao, Tao Shen, Chang Liu, Dongyan Zhao 0001. sigir 2023: 1934-1938