bert2BERT: Towards Reusable Pretrained Language Models

Cheng Chen, Yichun Yin, Lifeng Shang, Xin Jiang, Yujia Qin, Fengyu Wang, Zhi Wang, Xiao Chen, Zhiyuan Liu, Qun Liu. bert2BERT: Towards Reusable Pretrained Language Models. In Smaranda Muresan, Preslav Nakov, and Aline Villavicencio, editors, Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2134-2148, Dublin, Ireland, May 22-27, 2022. Association for Computational Linguistics.
