NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application

Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Qi Liu. NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Findings of the Association for Computational Linguistics: EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 16-20 November 2021, pages 3285-3295. Association for Computational Linguistics, 2021.
