VarMAE: Pre-training of Variational Masked Autoencoder for Domain-adaptive Language Understanding

Dou Hu, Xiaolong Hou, Xiyang Du, Mengyuan Zhou, Lianxin Jiang, Yang Mo, Xiaofeng Shi. VarMAE: Pre-training of Variational Masked Autoencoder for Domain-adaptive Language Understanding. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 6276-6286. Association for Computational Linguistics, 2022.
