The following publications are possible variants of this publication:
- Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners. Ningyu Zhang, Luoqiu Li, Xiang Chen, Shumin Deng, Zhen Bi, Chuanqi Tan, Fei Huang, Huajun Chen. ICLR 2022: [doi]
- Synthesize, Prompt and Transfer: Zero-shot Conversational Question Generation with Pre-trained Language Model. Hongwei Zeng, Bifan Wei, Jun Liu, Weiping Fu. ACL 2023: 8989-9010 [doi]
- Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion. Wenjie Xu, Ben Liu, Miao Peng, Xu Jia, Min Peng. ACL 2023: 7790-7803 [doi]
- Knowledge Prompting in Pre-trained Language Model for Natural Language Understanding. Jianing Wang, Wenkang Huang, Minghui Qiu, Qiuhui Shi, Hongbin Wang, Xiang Li, Ming Gao. EMNLP 2022: 3164-3177 [doi]
- MultiCapCLIP: Auto-Encoding Prompts for Zero-Shot Multilingual Visual Captioning. Bang Yang, Fenglin Liu, Xian Wu, Yaowei Wang, Xu Sun 0001, Yuexian Zou. ACL 2023: 11908-11922 [doi]