LogicST: A Logical Self-Training Framework for Document-Level Relation Extraction with Incomplete Annotations

Shengda Fan, Yanting Wang, Shasha Mo, Jianwei Niu. LogicST: A Logical Self-Training Framework for Document-Level Relation Extraction with Incomplete Annotations. In Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen, editors, Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024), Miami, FL, USA, November 12-16, 2024, pages 5496-5510. Association for Computational Linguistics, 2024.
