Towards Efficient Dialogue Pre-training with Transferable and Interpretable Latent Structure

Xueliang Zhao, Lemao Liu, Tingchen Fu, Shuming Shi, Dongyan Zhao, Rui Yan. Towards Efficient Dialogue Pre-training with Transferable and Interpretable Latent Structure. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 10051-10063. Association for Computational Linguistics, 2022.
