Knowledge Distillation with Source-free Unsupervised Domain Adaptation for BERT Model Compression

Jing Tian, Juan Chen, Ningjiang Chen, Lin Bai, Suqun Huang. Knowledge Distillation with Source-free Unsupervised Domain Adaptation for BERT Model Compression. In Weiming Shen, Jean-Paul A. Barthès, Junzhou Luo, Adriana S. Vivacqua, Daniel Schneider, Cheng Xie, Jinghui Zhang, Haibin Zhu, Kunkun Peng, Claudia Lage Rebello da Motta, editors, 26th International Conference on Computer Supported Cooperative Work in Design, CSCWD 2023, Rio de Janeiro, Brazil, May 24-26, 2023. pages 1766-1771, IEEE, 2023. [doi]