Mixture-of-Linguistic-Experts Adapters for Improving and Interpreting Pre-trained Language Models

Raymond Li, Gabriel Murray, Giuseppe Carenini. Mixture-of-Linguistic-Experts Adapters for Improving and Interpreting Pre-trained Language Models. In Houda Bouamor, Juan Pino, Kalika Bali, editors, Findings of the Association for Computational Linguistics: EMNLP 2023, Singapore, December 6-10, 2023, pages 9456-9469. Association for Computational Linguistics, 2023.