Adaptive Contrastive Knowledge Distillation for BERT Compression

Jinyang Guo, Jiaheng Liu, Zining Wang, Yuqing Ma, Ruihao Gong, Ke Xu, Xianglong Liu. Adaptive Contrastive Knowledge Distillation for BERT Compression. In Anna Rogers, Jordan L. Boyd-Graber, Naoaki Okazaki, editors, Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023, pages 8941-8953. Association for Computational Linguistics, 2023.
