Modeling Content Importance for Summarization with Pre-trained Language Models

Liqiang Xiao, Lu Wang, Hao He, Yaohui Jin. Modeling Content Importance for Summarization with Pre-trained Language Models. In Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu, editors, Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, November 16-20, 2020, pages 3606-3611. Association for Computational Linguistics, 2020.

@inproceedings{XiaoWHJ20-0,
  title = {Modeling Content Importance for Summarization with Pre-trained Language Models},
  author = {Liqiang Xiao and Lu Wang and Hao He and Yaohui Jin},
  year = {2020},
  url = {https://www.aclweb.org/anthology/2020.emnlp-main.293/},
  researchr = {https://researchr.org/publication/XiaoWHJ20-0},
  pages = {3606--3611},
  booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Online, November 16-20, 2020},
  editor = {Bonnie Webber and Trevor Cohn and Yulan He and Yang Liu},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-952148-60-6},
}