Pengcheng He, Baolin Peng, Song Wang, Yang Liu, Ruochen Xu, Hany Hassan, Yu Shi, Chenguang Zhu, Wayne Xiong, Michael Zeng, Jianfeng Gao, Xuedong Huang. Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2023, Toronto, Canada, July 9-14, 2023, pages 5095-5112. Association for Computational Linguistics, 2023.