TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations at Twitter

Xinyang Zhang, Yury Malkov, Omar Florez, Serim Park, Brian McWilliams, Jiawei Han, Ahmed El-Kishky. TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations at Twitter. In Ambuj Singh, Yizhou Sun, Leman Akoglu, Dimitrios Gunopulos, Xifeng Yan, Ravi Kumar, Fatma Ozcan, Jieping Ye, editors, Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2023, Long Beach, CA, USA, August 6-10, 2023, pages 5597-5607. ACM, 2023.

Abstract

Abstract is missing.