Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees

Jiangang Bai, Yujing Wang, Yiren Chen, Yaming Yang, Jing Bai, Jing Yu, Yunhai Tong. Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees. In Paola Merlo, Jörg Tiedemann, Reut Tsarfaty, editors, Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, EACL 2021, Online, April 19 - 23, 2021. pages 3011-3020, Association for Computational Linguistics, 2021.

@inproceedings{BaiWCYBYT21,
  title = {Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees},
  author = {Jiangang Bai and Yujing Wang and Yiren Chen and Yaming Yang and Jing Bai and Jing Yu and Yunhai Tong},
  year = {2021},
  url = {https://www.aclweb.org/anthology/2021.eacl-main.262/},
  researchr = {https://researchr.org/publication/BaiWCYBYT21},
  pages = {3011--3020},
  booktitle = {Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, EACL 2021, Online, April 19 - 23, 2021},
  editor = {Paola Merlo and Jörg Tiedemann and Reut Tsarfaty},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-954085-02-2},
}