The following publications are possible variants of this publication:
- Knowledge Distillation with Metric Learning for Medical Dialogue Generation. Qingqing Zhu, Pengfei Wu, Zhouxing Tan, Jiaxin Duan, Dongyan Zhao 0001, Junfei Liu. bibm 2021: 625-632 [doi]
- Accelerating Multi-Exit BERT Inference via Curriculum Learning and Knowledge Distillation. Shengwei Gu, Xiangfeng Luo, Xinzhi Wang 0001, Yike Guo. ijseke, 33(3):395-413, March 2023. [doi]
- Multi-Level Curriculum Learning for Multi-Turn Dialogue Generation. Guanhua Chen, Runzhe Zhan, Derek F. Wong, Lidia S. Chao. taslp, 31:3958-3967, 2023. [doi]
- Dynamic Curriculum Learning with Co-training for Medical Dialogue Generation. Qingqing Zhu, Zhouxing Tan, Jiaxin Duan, Pengfei Wu, Dongyan Zhao 0001, Junfei Liu. bibm 2021: 633-640 [doi]
- A Dual-Mode Learning Mechanism Combining Knowledge-Education and Machine-Learning. Yichang Chen, Anpin Chen. isnn 2008: 87-96 [doi]