The following publications are possible variants of this publication:
- Dynamic Micro-Expression Recognition Using Knowledge Distillation. Bo Sun 0006, Siming Cao, Dongliang Li, Jun He 0009, Lejun Yu. TAFFC, 13(2):1037-1043, 2022. [doi]
- A General Dynamic Knowledge Distillation Method for Visual Analytics. Zhigang Tu 0001, Xiangjian Liu, Xuan Xiao. TIP, 31:6517-6531, 2022. [doi]
- Dynamic PDGAN: discriminator-boosted knowledge distillation for StyleGANs. Yuesong Tian, Li Shen 0008, Xiang Tian, Zhifeng Li, Yaowu Chen. JEI, 33(1), 2023. [doi]
- Dynamic Refining Knowledge Distillation Based on Attention Mechanism. Xuan Peng, Fang Liu. PRICAI 2022: 45-58. [doi]
- Dynamic Knowledge Distillation for Pre-trained Language Models. Lei Li, Yankai Lin, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun 0001. EMNLP 2021: 379-389. [doi]
- Better Teacher Better Student: Dynamic Prior Knowledge for Knowledge Distillation. Martin Zong, Zengyu Qiu, Xinzhu Ma, Kunlin Yang, Chunya Liu, Jun Hou, Shuai Yi, Wanli Ouyang. ICLR 2023: [doi]