Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation

Xinyi Wang, Sebastian Ruder, Graham Neubig. Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation. In Smaranda Muresan, Preslav Nakov, Aline Villavicencio, editors, Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, May 22-27, 2022. pages 863-877, Association for Computational Linguistics, 2022.

@inproceedings{WangRN22,
  title = {Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation},
  author = {Xinyi Wang and Sebastian Ruder and Graham Neubig},
  year = {2022},
  url = {https://aclanthology.org/2022.acl-long.61},
  researchr = {https://researchr.org/publication/WangRN22},
  cites = {0},
  citedby = {0},
  pages = {863--877},
  booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), ACL 2022, Dublin, Ireland, May 22-27, 2022},
  editor = {Smaranda Muresan and Preslav Nakov and Aline Villavicencio},
  publisher = {Association for Computational Linguistics},
  isbn = {978-1-955917-21-6},
}