The following publications are possibly variants of this publication:
- An Information Distillation Framework for Extractive Summarization. Kuan-Yu Chen, Shih-Hung Liu, Berlin Chen, Hsin-Min Wang. taslp, 26(1):161-170, 2018. [doi]
- DistilSum: Distilling the Knowledge for Extractive Summarization. Ruipeng Jia, Yanan Cao, Haichao Shi, Fang Fang, Yanbing Liu, Jianlong Tan. CIKM 2020: 2069-2072 [doi]
- UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation. Zhengkun Zhang, Xiaojun Meng, Yasheng Wang, Xin Jiang, Qun Liu, Zhenglu Yang. AAAI 2022: 11757-11764 [doi]
- KD-VSUM: A Vision Guided Models for Multimodal Abstractive Summarization with Knowledge Distillation. Zehong Zheng, Changlong Li, Wenxin Hu, Su Wang. IJCNN 2024: 1-8 [doi]
- A Knowledge Graph Summarization Model Integrating Attention Alignment and Momentum Distillation. Zhao Wang, Xia Zhao. jaciii, 29(1):205-214, January 2025. [doi]