CharBERT: Character-aware Pre-trained Language Model

Wentao Ma, Yiming Cui, Chenglei Si, Ting Liu, Shijin Wang, Guoping Hu. CharBERT: Character-aware Pre-trained Language Model. In Donia Scott, Núria Bel, Chengqing Zong, editors, Proceedings of the 28th International Conference on Computational Linguistics, COLING 2020, Barcelona, Spain (Online), December 8-13, 2020. pages 39-50, International Committee on Computational Linguistics, 2020.
