Private Model Compression via Knowledge Distillation

Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu. Private Model Compression via Knowledge Distillation. In The Thirty-Third AAAI Conference on Artificial Intelligence, AAAI 2019, The Thirty-First Innovative Applications of Artificial Intelligence Conference, IAAI 2019, The Ninth AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, Honolulu, Hawaii, USA, January 27 - February 1, 2019. pages 1190-1197, AAAI Press, 2019.

@inproceedings{WangBSZCY19,
  title = {Private Model Compression via Knowledge Distillation},
  author = {Ji Wang and Weidong Bao and Lichao Sun and Xiaomin Zhu and Bokai Cao and Philip S. Yu},
  year = {2019},
  url = {https://aaai.org/ojs/index.php/AAAI/article/view/3913},
  researchr = {https://researchr.org/publication/WangBSZCY19},
  cites = {0},
  citedby = {0},
  pages = {1190--1197},
  booktitle = {The Thirty-Third AAAI Conference on Artificial Intelligence, AAAI 2019, The Thirty-First Innovative Applications of Artificial Intelligence Conference, IAAI 2019, The Ninth AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, Honolulu, Hawaii, USA, January 27 - February 1, 2019},
  publisher = {AAAI Press},
  isbn = {978-1-57735-809-1},
}