The following publications are possibly variants of this publication:
- Build a Good Human-Free Prompt Tuning: Jointly Pre-Trained Template and Verbalizer for Few-Shot Classification. Mouxiang Chen, Han Fu, Chenghao Liu, Xiaoyun Joy Wang, Zhuo Li, Jianling Sun. tkde, 37(5):2253-2265, May 2025. [doi]
- Pruned Contrastive Learning Verbalizer for Prompt-based Few-shot Text Classification. Jiaqi Zhao, Rongheng Lin, Baigen Wang, Ou Wang, Qian Zhao, Huizhou Liu. aibdf 2023: 370-377 [doi]
- Partial-Tuning Based Mixed-Modal Prototypes for Few-Shot Classification. Yuling Su, Xueliang Liu, Ye Zhao 0001, Richang Hong, Meng Wang 0001. tmm, 26:9175-9186, 2024. [doi]
- Prompt-Based Few-Shot Text Classification with Multi-Granularity Label Augmentation and Adaptive Verbalizer. Deling Huang, Zanxiong Li, Jian Yu 0002, Yulong Zhou. information, 17(1):58, 2026. [doi]
- Knowledgeable Prompt-tuning: Incorporating Knowledge into Prompt Verbalizer for Text Classification. Shengding Hu, Ning Ding, Huadong Wang, Zhiyuan Liu, Jingang Wang, Juanzi Li, Wei Wu, Maosong Sun. acl 2022: 2225-2240 [doi]
- Virtual prompt pre-training for prototype-based few-shot relation extraction. Kai He, Yucheng Huang, Rui Mao 0010, Tieliang Gong, Chen Li 0011, Erik Cambria. eswa, 213(Part):118927, 2023. [doi]
- A novel prompt-tuning method: Incorporating scenario-specific concepts into a verbalizer. Yong Ma, Senlin Luo, Yu-Ming Shang, Zhengjun Li, Yong Liu. eswa, 247:123204, 2024. [doi]