HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression

Chenhe Dong, Yaliang Li, Ying Shen, Minghui Qiu. HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Virtual Event / Punta Cana, Dominican Republic, 7-11 November 2021, pages 3126-3136. Association for Computational Linguistics, 2021.

Abstract

Abstract is missing.