XLM-E: Cross-lingual Language Model Pre-training via ELECTRA

Zewen Chi, Shaohan Huang, Li Dong, Shuming Ma, Bo Zheng, Saksham Singhal, Payal Bajaj, Xia Song, Xian-Ling Mao, Heyan Huang, Furu Wei. XLM-E: Cross-lingual Language Model Pre-training via ELECTRA. In Smaranda Muresan, Preslav Nakov, Aline Villavicencio, editors, Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, May 22-27, 2022, pages 6170-6182. Association for Computational Linguistics, 2022.

@inproceedings{ChiH0MZSBSMHW22,
  title = {XLM-E: Cross-lingual Language Model Pre-training via ELECTRA},
  author = {Zewen Chi and Shaohan Huang and Li Dong and Shuming Ma and Bo Zheng and Saksham Singhal and Payal Bajaj and Xia Song and Xian-Ling Mao and Heyan Huang and Furu Wei},
  year = {2022},
  url = {https://aclanthology.org/2022.acl-long.427},
  researchr = {https://researchr.org/publication/ChiH0MZSBSMHW22},
  pages = {6170--6182},
  booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, May 22-27, 2022},
  editor = {Smaranda Muresan and Preslav Nakov and Aline Villavicencio},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-955917-21-6},
}