MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding

Jia-Chen Gu, Chongyang Tao, Zhen-Hua Ling, Can Xu, Xiubo Geng, Daxin Jiang. MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding. In Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli, editors, Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL/IJCNLP 2021), Volume 1: Long Papers, Virtual Event, August 1-6, 2021, pages 3682-3692. Association for Computational Linguistics, 2021.