On the utility of enhancing BERT syntactic bias with Token Reordering Pretraining

Yassir El Mesbahi, Atif Mahmud, Abbas Ghaddar, Mehdi Rezagholizadeh, Philippe Langlais, Prasanna Parthasarathi. On the utility of enhancing BERT syntactic bias with Token Reordering Pretraining. In Jing Jiang, David Reitter, Shumin Deng, editors, Proceedings of the 27th Conference on Computational Natural Language Learning, CoNLL 2023, Singapore, December 6-7, 2023. pages 165-182, Association for Computational Linguistics, 2023.

Abstract

Abstract is missing.