Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression

Zhao Yang, Yuanzhe Zhang, Dianbo Sui, Yiming Ju, Jun Zhao, Kang Liu. Explanation Guided Knowledge Distillation for Pre-trained Language Model Compression. ACM Trans. Asian Low-Resour. Lang. Inf. Process., 23(2), February 2024.
