Smaller but Better: Self-Paced Knowledge Distillation for Lightweight yet Effective LCMs

Yujia Chen, Yang Ye, Zhongqi Li, Yuchi Ma, Cuiyun Gao. Smaller but Better: Self-Paced Knowledge Distillation for Lightweight yet Effective LCMs. Proc. ACM Softw. Eng., 2(FSE):3057-3080, 2025.
