The following publications are possibly variants of this publication:
- Self-distillation and Uncertainty Boosting Self-supervised Monocular Depth Estimation. Hang Zhou, Sarah Taylor, David Greenwood 0001, Michal Mackiewicz. BMVC 2022: 7 [doi]
- Complementary Calibration: Boosting General Continual Learning With Collaborative Distillation and Self-Supervision. Zhong Ji, Jin Li, Qiang Wang, Zhongfei Zhang. TIP, 32:657-667, 2023. [doi]
- Multiple instance tracking based on hierarchical maximizing bag's margin boosting. Chunxiao Liu, Guijin Wang, Xinggang Lin, Bobo Zeng. ICASSP 2011: 1193-1196 [doi]
- SEDMA: Self-Distillation with Model Aggregation for Membership Privacy. Tsunato Nakai, Ye Wang, Kota Yoshida, Takeshi Fujino. PoPETs, 2024(1):494-508, January 2024. [doi]
- Weakly supervised object detection from remote sensing images via self-attention distillation and instance-aware mining. Peng Yang, Shi Zhou, Linlin Wang, Guowei Yang. MTA, 83(13):39073-39095, April 2024. [doi]