mPLM-Sim: Better Cross-Lingual Similarity and Transfer in Multilingual Pretrained Language Models

Peiqin Lin, Chengzhi Hu, Zheyu Zhang, André F. T. Martins, Hinrich Schütze. "mPLM-Sim: Better Cross-Lingual Similarity and Transfer in Multilingual Pretrained Language Models." In Yvette Graham and Matthew Purver, editors, Findings of the Association for Computational Linguistics: EACL 2024, St. Julian's, Malta, March 17-22, 2024, pages 276-310. Association for Computational Linguistics, 2024.
