Patient Knowledge Distillation for BERT Model Compression

Siqi Sun, Yu Cheng, Zhe Gan, Jingjing Liu. Patient Knowledge Distillation for BERT Model Compression. In Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan, editors, Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019), Hong Kong, China, November 3-7, 2019, pages 4322-4331. Association for Computational Linguistics, 2019. doi: 10.18653/v1/D19-1441

@inproceedings{SunCGL19,
  title = {Patient Knowledge Distillation for BERT Model Compression},
  author = {Siqi Sun and Yu Cheng and Zhe Gan and Jingjing Liu},
  year = {2019},
  doi = {10.18653/v1/D19-1441},
  url = {https://doi.org/10.18653/v1/D19-1441},
  researchr = {https://researchr.org/publication/SunCGL19},
  pages = {4322-4331},
  booktitle = {Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, EMNLP-IJCNLP 2019, Hong Kong, China, November 3-7, 2019},
  editor = {Kentaro Inui and Jing Jiang and Vincent Ng and Xiaojun Wan 0001},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-950737-90-1},
}