BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning

Songling Zhu, Ronghua Shang, Ke Tang, Songhua Xu, Yangyang Li. BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning. Knowl.-Based Syst., 279:110916, November 2023.

Abstract

No abstract is available for this entry.