Fine-tuning Pre-trained Language Models for Few-shot Intent Detection: Supervised Pre-training and Isotropization

Haode Zhang, Haowen Liang, Yuwei Zhang, Li-Ming Zhan, Xiao-Ming Wu, Xiaolei Lu, Albert Y. S. Lam. Fine-tuning Pre-trained Language Models for Few-shot Intent Detection: Supervised Pre-training and Isotropization. In Marine Carpuat, Marie-Catherine de Marneffe, Iván Vladimir Meza Ruíz, editors, Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2022, Seattle, WA, United States, July 10-15, 2022. pages 532-542, Association for Computational Linguistics, 2022.

@inproceedings{ZhangLZZ0LL22,
  title = {Fine-tuning Pre-trained Language Models for Few-shot Intent Detection: Supervised Pre-training and Isotropization},
  author = {Haode Zhang and Haowen Liang and Yuwei Zhang and Li-Ming Zhan and Xiao-Ming Wu and Xiaolei Lu and Albert Y. S. Lam},
  year = {2022},
  url = {https://aclanthology.org/2022.naacl-main.39},
  researchr = {https://researchr.org/publication/ZhangLZZ0LL22},
  pages = {532-542},
  booktitle = {Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2022, Seattle, WA, United States, July 10-15, 2022},
  editor = {Marine Carpuat and Marie-Catherine de Marneffe and Iván Vladimir Meza Ruíz},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-955917-71-1},
}
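
For reference, a minimal LaTeX sketch showing how this record could be cited; it assumes the BibTeX entry above has been saved to a file named references.bib (an assumed filename, not part of the record):

\documentclass{article}
\begin{document}
Few-shot intent detection via supervised pre-training and
isotropization \cite{ZhangLZZ0LL22}.

\bibliographystyle{plain}
% "references" is an assumed bibliography file containing the entry above.
\bibliography{references}
\end{document}

Compiling with pdflatex, then bibtex, then pdflatex twice resolves the citation; non-standard fields such as researchr are simply ignored by BibTeX.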