DOCmT5: Document-Level Pretraining of Multilingual Language Models

Chia-Hsuan Lee, Aditya Siddhant, Viresh Ratnakar, and Melvin Johnson. DOCmT5: Document-Level Pretraining of Multilingual Language Models. In Marine Carpuat, Marie-Catherine de Marneffe, and Iván Vladimir Meza Ruíz, editors, Findings of the Association for Computational Linguistics: NAACL 2022, Seattle, WA, United States, July 10-15, 2022, pages 425-437. Association for Computational Linguistics, 2022.
