Pre-training Polish Transformer-Based Language Models at Scale

Sławomir Dadas, Michał Perełkiewicz, Rafał Poświata. Pre-training Polish Transformer-Based Language Models at Scale. In Leszek Rutkowski, Rafał Scherer, Marcin Korytkowski, Witold Pedrycz, Ryszard Tadeusiewicz, Jacek M. Zurada, editors, Artificial Intelligence and Soft Computing - 19th International Conference, ICAISC 2020, Zakopane, Poland, October 12-14, 2020, Proceedings, Part II. Volume 12416 of Lecture Notes in Computer Science, pages 301-314, Springer, 2020. [doi]

Authors

Sławomir Dadas

Michał Perełkiewicz

Rafał Poświata
