CLOWER: A Pre-trained Language Model with Contrastive Learning over Word and Character Representations

Borun Chen, Hongyin Tang, Jiahao Bu, Kai Zhang, Jingang Wang, Qifan Wang, Hai-Tao Zheng, Wei Wu, Liqian Yu. CLOWER: A Pre-trained Language Model with Contrastive Learning over Word and Character Representations. In Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, YoungGyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na, editors, Proceedings of the 29th International Conference on Computational Linguistics, COLING 2022, Gyeongju, Republic of Korea, October 12-17, 2022, pages 3098-3108. International Committee on Computational Linguistics, 2022.
