Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization

Pengcheng He, Baolin Peng, Song Wang, Yang Liu, Ruochen Xu, Hany Hassan, Yu Shi, Chenguang Zhu, Wayne Xiong, Michael Zeng 0001, Jianfeng Gao, Xuedong Huang 0001. Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2023, Toronto, Canada, July 9-14, 2023. pages 5095-5112, Association for Computational Linguistics, 2023. [doi]

Authors

Pengcheng He
Baolin Peng
Song Wang
Yang Liu
Ruochen Xu
Hany Hassan
Yu Shi
Chenguang Zhu
Wayne Xiong
Michael Zeng 0001
Jianfeng Gao
Xuedong Huang 0001