The following publications are possibly variants of this publication:
- Efficient Training of Lightweight Neural Networks Using Online Self-Acquired Knowledge Distillation. Maria Tzelepi, Anastasios Tefas. ICME 2021: 1-6 [doi]
- Online Subclass Knowledge Distillation. Maria Tzelepi, Nikolaos Passalis, Anastasios Tefas. Expert Systems with Applications, 181:115132, 2021. [doi]
- Switchable Online Knowledge Distillation. Biao Qian, Yang Wang 0023, Hongzhi Yin, Richang Hong, Meng Wang 0001. ECCV 2022: 449-466 [doi]
- Probabilistic Online Self-Distillation. Maria Tzelepi, Nikolaos Passalis, Anastasios Tefas. Neurocomputing, 493:592-604, 2022. [doi]
- Online Knowledge Distillation with Diverse Peers. Defang Chen, Jian-Ping Mei, Can Wang 0001, Yan Feng, Chun Chen 0001. AAAI 2020: 3430-3437 [doi]