PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining

Machel Reid, Mikel Artetxe. PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining. In Spandana Gella, He He, Bodhisattwa Prasad Majumder, Burcu Can, Eleonora Giunchiglia, Samuel Cahyawijaya, Sewon Min, Maximilian Mozes, Xiang Lorraine Li, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Laura Rimell, Chris Dyer, editors, Proceedings of the 7th Workshop on Representation Learning for NLP, RepL4NLP@ACL 2022, Dublin, Ireland, May 26, 2022, pages 20-28. Association for Computational Linguistics, 2022.

Abstract

Abstract not available for this entry.