Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization

Pengcheng He, Baolin Peng, Song Wang, Yang Liu, Ruochen Xu, Hany Hassan, Yu Shi, Chenguang Zhu, Wayne Xiong, Michael Zeng, Jianfeng Gao, Xuedong Huang. Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2023, Toronto, Canada, July 9-14, 2023, pages 5095-5112. Association for Computational Linguistics, 2023.

@inproceedings{HePWLXHSZX0G023,
  title = {Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization},
  author = {Pengcheng He and Baolin Peng and Song Wang and Yang Liu and Ruochen Xu and Hany Hassan and Yu Shi and Chenguang Zhu and Wayne Xiong and Michael Zeng and Jianfeng Gao and Xuedong Huang},
  year = {2023},
  url = {https://aclanthology.org/2023.acl-long.279},
  pages = {5095--5112},
  booktitle = {Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2023, Toronto, Canada, July 9-14, 2023},
  editor = {Anna Rogers and Jordan L. Boyd-Graber and Naoaki Okazaki},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-959429-72-2},
}
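
As a usage note, here is a minimal LaTeX sketch showing how this record might be cited; the file name references.bib is an assumption (save the BibTeX entry above under that name), and the citation key matches the entry exactly.

% Minimal sketch: citing the entry above from a LaTeX document.
% Assumes the BibTeX record is saved as references.bib (hypothetical file name).
\documentclass{article}
\begin{document}
Z-Code++ \cite{HePWLXHSZX0G023} is a pre-trained language model
optimized for abstractive summarization.
\bibliographystyle{plain} % any installed style works here
\bibliography{references}
\end{document}

Compiling with pdflatex followed by bibtex (then pdflatex twice more) resolves the citation and renders the bibliography entry.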