LogicNMR: Probing the Non-monotonic Reasoning Ability of Pre-trained Language Models

Yeliang Xiu, Zhanhao Xiao, Yongmei Liu. LogicNMR: Probing the Non-monotonic Reasoning Ability of Pre-trained Language Models. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 3616-3626. Association for Computational Linguistics, 2022.
