The following publications are possible variants of this publication:
- Attention-Guided Answer Distillation for Machine Reading Comprehension. Minghao Hu, Yuxing Peng, Furu Wei, Zhen Huang, Dongsheng Li, Nan Yang, Ming Zhou. EMNLP 2018: 2077-2086 [doi]
- S-Net: From Answer Extraction to Answer Synthesis for Machine Reading Comprehension. Chuanqi Tan, Furu Wei, Nan Yang, Bowen Du, Weifeng Lv, Ming Zhou. AAAI 2018: 5940-5947 [doi]
- SG-Net: Syntax Guided Transformer for Language Representation. Zhuosheng Zhang 0001, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang 0015. PAMI, 44(6):3285-3299, 2022 [doi]
- Cross-Lingual Machine Reading Comprehension. Yiming Cui, Wanxiang Che, Ting Liu 0001, Bing Qin 0001, Shijin Wang, Guoping Hu. EMNLP 2019: 1586-1595 [doi]
- Retrospective Reader for Machine Reading Comprehension. Zhuosheng Zhang 0001, Junjie Yang, Hai Zhao. AAAI 2021: 14506-14514 [doi]
- Event Extraction as Machine Reading Comprehension. Jian Liu, Yubo Chen 0001, Kang Liu 0001, Wei Bi, Xiaojiang Liu. EMNLP 2020: 1641-1651 [doi]