LogicNMR: Probing the Non-monotonic Reasoning Ability of Pre-trained Language Models

Yeliang Xiu, Zhanhao Xiao, Yongmei Liu. LogicNMR: Probing the Non-monotonic Reasoning Ability of Pre-trained Language Models. In Yoav Goldberg, Zornitsa Kozareva, Yue Zhang, editors, Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022, pages 3616-3626. Association for Computational Linguistics, 2022.

@inproceedings{XiuX022,
  title = {LogicNMR: Probing the Non-monotonic Reasoning Ability of Pre-trained Language Models},
  author = {Yeliang Xiu and Zhanhao Xiao and Yongmei Liu},
  year = {2022},
  url = {https://aclanthology.org/2022.findings-emnlp.265},
  researchr = {https://researchr.org/publication/XiuX022},
  pages = {3616-3626},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022},
  editor = {Yoav Goldberg and Zornitsa Kozareva and Yue Zhang},
  publisher = {Association for Computational Linguistics},
}