GreekBART: The First Pretrained Greek Sequence-to-Sequence Model

Iakovos Evdaimon, Hadi Abdine, Christos Xypolopoulos, Stamatis Outsios, Michalis Vazirgiannis, Giorgos Stamou. GreekBART: The First Pretrained Greek Sequence-to-Sequence Model. In Nicoletta Calzolari, Min-Yen Kan, Véronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue, editors, Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), Torino, Italy, 20-25 May 2024, pages 7949-7962. ELRA and ICCL, 2024.
