Co-training and Co-distillation for Quality Improvement and Compression of Language Models

Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Hongbo Zhang, Sung Ju Hwang, Alexander Min. Co-training and Co-distillation for Quality Improvement and Compression of Language Models. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023, pages 7458-7467. Association for Computational Linguistics, 2023.
