HadSkip: Homotopic and Adaptive Layer Skipping of Pre-trained Language Models for Efficient Inference

Haoyu Wang, Yaqing Wang, Tianci Liu, Tuo Zhao, Jing Gao. HadSkip: Homotopic and Adaptive Layer Skipping of Pre-trained Language Models for Efficient Inference. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023. pages 4283-4294, Association for Computational Linguistics, 2023.
