Beyond prompting: Making Pre-trained Language Models Better Zero-shot Learners by Clustering Representations

Yu Fei, Zhao Meng, Ping Nie, Roger Wattenhofer, Mrinmaya Sachan. Beyond prompting: Making Pre-trained Language Models Better Zero-shot Learners by Clustering Representations. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 8560-8579. Association for Computational Linguistics, 2022.

Abstract

Abstract is missing.