SMedBERT: A Knowledge-Enhanced Pre-trained Language Model with Structured Semantics for Medical Text Mining

Taolin Zhang, Zerui Cai, Chengyu Wang, Minghui Qiu, Bite Yang, Xiaofeng He. SMedBERT: A Knowledge-Enhanced Pre-trained Language Model with Structured Semantics for Medical Text Mining. In Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli, editors, Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL/IJCNLP 2021 (Volume 1: Long Papers), Virtual Event, August 1-6, 2021, pages 5882-5893. Association for Computational Linguistics, 2021.

Authors

Taolin Zhang

Zerui Cai

Chengyu Wang

Minghui Qiu

Bite Yang

Xiaofeng He