The following publications are possibly variants of this publication:
- Prompt Tuning in Biomedical Relation Extraction. Jianping He 0002, Fang Li, Jianfu Li, Xinyue Hu, Yi Nian, Yang Xiang 0003, Jingqi Wang, Qiang Wei, Yiming Li, Hua Xu 0001, Cui Tao. jhir, 8(2):206-224, June 2024. [doi]
- SE-Prompt: Exploring Semantic Enhancement with Prompt Tuning for Relation Extraction. Cai Wang, Dongyang Li, Xiaofeng He. adma 2023: 109-122 [doi]
- Relation Extraction with Knowledge-Enhanced Prompt-Tuning on Multimodal Knowledge Graph. Yan Ming, Yong Shang, Huiting Li. SMC 2023: 460-465 [doi]
- Relation Extraction as Open-book Examination: Retrieval-enhanced Prompt Tuning. Xiang Chen 0016, Lei Li, Ningyu Zhang, Chuanqi Tan, Fei Huang, Luo Si, Huajun Chen. sigir 2022: 2443-2448 [doi]
- Retrieval-Enhanced Event Temporal Relation Extraction by Prompt Tuning. Rong Luo, Po Hu. apweb 2024: 16-30 [doi]
- Prompt Tuning for Discriminative Pre-trained Language Models. Yuan Yao, Bowen Dong, Ao Zhang, Zhengyan Zhang, Ruobing Xie, Zhiyuan Liu, Leyu Lin, Maosong Sun, Jianyong Wang 0001. acl 2022: 3468-3473 [doi]
- A general approach for improving deep learning-based medical relation extraction using a pre-trained model and fine-tuning. Tao Chen, Mingfen Wu, Hexi Li. biodb, 2019, 2019. [doi]
- Virtual prompt pre-training for prototype-based few-shot relation extraction. Kai He, Yucheng Huang, Rui Mao 0010, Tieliang Gong, Chen Li 0011, Erik Cambria. eswa, 213(Part):118927, 2023. [doi]