Learning and Evaluating a Differentially Private Pre-trained Language Model

Shlomo Hoory, Amir Feder, Avichai Tendler, Sofia Erell, Alon Peled-Cohen, Itay Laish, Hootan Nakhost, Uri Stemmer, Ayelet Benjamini, Avinatan Hassidim, Yossi Matias. Learning and Evaluating a Differentially Private Pre-trained Language Model. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, and Scott Wen-tau Yih, editors, Findings of the Association for Computational Linguistics: EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 16-20 November 2021, pages 1178-1189. Association for Computational Linguistics, 2021. [doi]

@inproceedings{HooryFTEPLNSBHM21,
  title = {Learning and Evaluating a Differentially Private Pre-trained Language Model},
  author = {Shlomo Hoory and Amir Feder and Avichai Tendler and Sofia Erell and Alon Peled-Cohen and Itay Laish and Hootan Nakhost and Uri Stemmer and Ayelet Benjamini and Avinatan Hassidim and Yossi Matias},
  year = {2021},
  url = {https://aclanthology.org/2021.findings-emnlp.102},
  researchr = {https://researchr.org/publication/HooryFTEPLNSBHM21},
  cites = {0},
  citedby = {0},
  pages = {1178--1189},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 16-20 November, 2021},
  editor = {Marie-Francine Moens and Xuanjing Huang and Lucia Specia and Scott Wen-tau Yih},
  publisher = {Association for Computational Linguistics},
}