Pre-training Polish Transformer-Based Language Models at Scale

Sławomir Dadas, Michał Perełkiewicz, Rafał Poświata. Pre-training Polish Transformer-Based Language Models at Scale. In Leszek Rutkowski, Rafał Scherer, Marcin Korytkowski, Witold Pedrycz, Ryszard Tadeusiewicz, Jacek M. Zurada, editors, Artificial Intelligence and Soft Computing - 19th International Conference, ICAISC 2020, Zakopane, Poland, October 12-14, 2020, Proceedings, Part II. Volume 12416 of Lecture Notes in Computer Science, pages 301-314, Springer, 2020.

Abstract

Abstract is missing.