ConvFiT: Conversational Fine-Tuning of Pretrained Language Models

Ivan Vulić, Pei-Hao Su, Samuel Coope, Daniela Gerz, Paweł Budzianowski, Iñigo Casanueva, Nikola Mrkšić, Tsung-Hsien Wen. ConvFiT: Conversational Fine-Tuning of Pretrained Language Models. In Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih, editors, Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 7-11 November, 2021, pages 1151-1168. Association for Computational Linguistics, 2021.

@inproceedings{VulicSCGBCMW21,
  title = {ConvFiT: Conversational Fine-Tuning of Pretrained Language Models},
  author = {Ivan Vulić and Pei-Hao Su and Samuel Coope and Daniela Gerz and Paweł Budzianowski and Iñigo Casanueva and Nikola Mrkšić and Tsung-Hsien Wen},
  year = {2021},
  url = {https://aclanthology.org/2021.emnlp-main.88},
  researchr = {https://researchr.org/publication/VulicSCGBCMW21},
  pages = {1151-1168},
  booktitle = {Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 7-11 November, 2021},
  editor = {Marie-Francine Moens and Xuanjing Huang and Lucia Specia and Scott Wen-tau Yih},
  publisher = {Association for Computational Linguistics},
}