ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding

Dongling Xiao, Yu-Kun Li, Han Zhang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang. ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding. In Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tür, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou, editors, Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, June 6-11, 2021, pages 1702-1715. Association for Computational Linguistics, 2021.

@inproceedings{XiaoLZSTWW21,
  title = {ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding},
  author = {Dongling Xiao and Yu-Kun Li and Han Zhang and Yu Sun and Hao Tian and Hua Wu and Haifeng Wang},
  year = {2021},
  url = {https://www.aclweb.org/anthology/2021.naacl-main.136/},
  pages = {1702--1715},
  booktitle = {Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, June 6-11, 2021},
  editor = {Kristina Toutanova and Anna Rumshisky and Luke Zettlemoyer and Dilek Hakkani-Tür and Iz Beltagy and Steven Bethard and Ryan Cotterell and Tanmoy Chakraborty and Yichao Zhou},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-954085-46-6},
}