bert2BERT: Towards Reusable Pretrained Language Models

Cheng Chen, Yichun Yin, Lifeng Shang, Xin Jiang 0002, Yujia Qin, Fengyu Wang, Zhi Wang, Xiao Chen, Zhiyuan Liu, Qun Liu 0001. bert2BERT: Towards Reusable Pretrained Language Models. In Smaranda Muresan, Preslav Nakov, Aline Villavicencio, editors, Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, May 22-27, 2022. pages 2134-2148, Association for Computational Linguistics, 2022.

Authors

Cheng Chen

Yichun Yin

Lifeng Shang

Xin Jiang 0002

Yujia Qin

Fengyu Wang

Zhi Wang

Xiao Chen

Zhiyuan Liu

Qun Liu 0001